In this study, we investigate the use of machine learning and text mining techniques to classify and analyze TED Talk videos based on their transcript data. We perform sentiment analysis to identify the opinions and feelings expressed by TED speakers about each talk topic, and use topic analysis to cluster the videos and compare the clusters to the categories labeled by the TED website. We also apply text classification techniques to predict the topics of new videos using a random forest model. Our results show that TED talks tend to present a positive sentiment and that the clusters generated by Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) align closely with the known categories. The best supervised model, which combined LSA on Term Frequency-Inverse Document Frequency (TF-IDF) features with additional information, achieved an overall accuracy of 0.90. Limitations and potential directions for future research are also discussed.
There are several ways to learn and share knowledge today, one of the most popular being short video clips. We are interested in using both machine learning and text mining techniques to analyze the classification of videos from their transcript or subtitle data. Initially, we considered several streaming websites and video podcasts, such as YouTube, BBC Learning English, Apple Podcasts, and TED. TED was ultimately chosen for this project because of the availability of data for our study and its wide range of videos in terms of topics, languages, and lengths. Furthermore, each TED video is labeled with a relevant category and includes a transcript.
The goal of this project is to first use sentiment analysis to identify opinions, judgments, or feelings expressed by TED speakers about each TED talk topic. Second, we will use topic analysis to cluster the videos and compare them to the categories labeled by the TED website. Finally, we will apply text classification techniques to predict the topics of new videos.
The rest of the project is organized as follows: Section 2 describes the data and web scraping, Section 3 presents tokenization, Section 4 presents exploratory data analysis, Section 5 presents sentiment analysis, Section 6 performs topic modeling analysis, Section 7 performs embedding analysis, and Section 8 performs supervised analysis. The main results of the study, as well as limitations and potential future research, are presented in Section 9.
To acquire the transcript text from TED Talk videos, we implemented a web scraping procedure on the TED website, with the following steps:
# go to TED talk
drop_down <- remDr$findElement(using = 'xpath', '//*[@id="menu-button--0"]/div')
remDr$mouseMoveToLocation(webElement=drop_down)
remDr$findElement(using = 'xpath', '//*[@id="option-0--0"]/div[1]')$clickElement()
Sys.sleep(2)
# set the language filter to English
drop_down <- remDr$findElement(using = 'xpath', '//*[@id="languages"]')
remDr$mouseMoveToLocation(webElement=drop_down)
remDr$click(2)
remDr$findElement(using = "xpath", '//*[@id="languages"]/optgroup/option[1]')$clickElement()
Sys.sleep(1)
# select a topic (take climate change as an example)
drop_down <- remDr$findElement(using = 'xpath', '//*[@id="topics"]')
remDr$mouseMoveToLocation(webElement=drop_down)
remDr$click(2)
remDr$findElement(using = "xpath", '//*[@id="topics"]/option[3]')$clickElement()
remDr$findElement(using = "xpath", '/html/body/div[4]/div[2]/div/div/div/div[3]/ul[1]/li[3]/a')$clickElement()
# to pick a topic starting with a different capital letter, change the index in [?]/a of the xpath above
topic <- remDr$findElement(using = "partial link text", 'Climate change') ## put the topic name here
remDr$mouseMoveToLocation(webElement=topic)
remDr$click(2)
Sys.sleep(1)
# sort by most relevant
drop_down <- remDr$findElement(using = 'xpath', '//*[@id="filters-sort"]')
remDr$mouseMoveToLocation(webElement=drop_down)
remDr$click(2)
remDr$findElement(using = "xpath", '//*[@id="filters-sort"]/optgroup/option[2]')$clickElement()
# first crawl all the videos' titles, starting from the first page
html_page <- remDr$getPageSource()[[1]]
# each page lists 36 videos; we want about 100 videos per topic, so we scrape 3 pages
page <- 3
title <- character()
speaker <- character()
views_times <- character()
page_num <- character()
for (i in 1:page) {
  page_title <- read_html(html_page) %>%
    html_nodes(xpath = "//*[@id='browse-results']/div[1]/div/div/div/div/div[2]/h4[2]/a") %>%
    html_text()
  page_title <- gsub("\n", "", page_title)
  page_speaker <- read_html(html_page) %>%
    html_nodes(xpath = "//*[@id='browse-results']/div[1]/div/div/div/div/div[2]/h4[1]") %>%
    html_text()
  page_views_times <- read_html(html_page) %>%
    html_nodes(xpath = "//*[@id='browse-results']/div[1]/div/div/div/div/div[2]/div/span/span") %>%
    html_text()
  page_views_times <- gsub("\n", "", page_views_times)
  page_page <- rep(i, times = length(page_title))
  # move to the next page and wait for it to load
  next_page <- remDr$findElement(using = 'link text', 'Next')
  remDr$mouseMoveToLocation(webElement = next_page)
  remDr$click(2)
  Sys.sleep(10)
  html_page <- remDr$getPageSource()[[1]]
  title <- append(title, page_title)
  speaker <- append(speaker, page_speaker)
  views_times <- append(views_times, page_views_times)
  page_num <- append(page_num, page_page)
}
browse_result <- data.frame(
  "page" = page_num,
  "title" = title,
  "speaker" = speaker,
  "views_times" = views_times,
  "cate" = "Climate Change")
# click into each video to capture its information
introduction <- character()
likes <- character()
tanscript <- character()  # note: "tanscript" is the original (misspelled) column name, kept for consistency
title_re <- character()
n <- length(waitforscrape$title)
for (i in 1:n) {
  Sys.sleep(3)
  # search for the video by its title
  search <- remDr$findElement(using = 'xpath', '//*[@id="filters"]/div[1]/div/div[2]/div[1]/div[1]/div/div[1]/div/input')
  search$clickElement()
  Sys.sleep(5)
  search$clearElement()
  search$sendKeysToElement(list(waitforscrape$title[i], key = "enter"))
  Sys.sleep(8)
  # click into the video
  video_page <- remDr$findElement(using = 'xpath', "//*[@id='browse-results']/div[1]/div[1]/div/div/div/div[2]/h4[2]/a")
  remDr$mouseMoveToLocation(webElement = video_page)
  remDr$click(2)
  Sys.sleep(15)
  video_title <- waitforscrape$title[i]
  # open the transcript
  drop_down <- remDr$findElement(using = 'xpath', "//*[@id='maincontent']/div/div/div/div/div[2]/div[3]/div[2]/button")
  remDr$mouseMoveToLocation(webElement = drop_down)
  remDr$click(2)
  Sys.sleep(5)
  # begin to crawl the information
  html_page <- remDr$getPageSource()[[1]]
  video_sum <- read_html(html_page) %>%
    html_nodes(xpath = "//*[@id='maincontent']/div/div/div/div/div[2]/div[3]/div[1]/div[2]/div/div") %>%
    html_text()
  video_sum <- video_sum[1]
  Sys.sleep(5)
  video_likes <- read_html(html_page) %>%
    html_nodes(xpath = "//*[@id='maincontent']/div/div/div/div/div[2]/div[1]/div[3]/button[1]/div/div/span") %>%
    html_text()
  Sys.sleep(5)
  video_tanscript <- read_html(html_page) %>%
    html_nodes(xpath = "//*[@id='maincontent']/div/div/div/aside/div[2]/div[2]/div/div/div[1]") %>%
    html_text()
  Sys.sleep(5)
  remDr$goBack()
  Sys.sleep(5)
  introduction <- append(introduction, video_sum)
  likes <- append(likes, video_likes)
  tanscript <- append(tanscript, video_tanscript)
  title_re <- append(title_re, video_title)
}
video_info <- data.frame(
  "title" = title_re,
  "introduction" = introduction,
  "likes" = likes,
  "tanscript" = tanscript)
In the loop above, we click the Read transcript button to expand the transcript text area, and then scrape all the related information that we might use in the following analysis. During this process, we found that the for loop that clicks into each video and scrapes its text was often interrupted, and some xpaths failed to work when the scraper was run on different days. We therefore adopted the following countermeasures:

- Since we obtain the list of video titles first, after an interruption we compare the list of videos already scraped successfully against the full title list, which gives us the list of videos still left to scrape.
- We alternate between four locator strategies — css, xpath, link text, and partial link text — to locate the videos and the Read transcript and Next buttons.
- Because this is dynamic web crawling, we add Sys.sleep() to each scraping and clicking step, giving the website time to react.
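The interruption-recovery and locator-fallback steps described above can be sketched as follows. This is a minimal sketch: the titles are made up (in practice they come from browse_result and video_info), and the helper name and the css selector are illustrative assumptions, not the site's actual markup.

```r
# 1) Resume after an interruption: compare the full title list against
#    the titles already scraped; only the remainder is scraped next run.
all_titles     <- c("Talk A", "Talk B", "Talk C", "Talk D")
scraped_titles <- c("Talk A", "Talk B")  # scraped successfully before the crash
waitforscrape  <- data.frame(title = setdiff(all_titles, scraped_titles))

# 2) Fall back across locator strategies when one of them fails
#    (hypothetical helper; the css selector below is illustrative).
find_with_fallback <- function(remDr, selectors) {
  for (s in selectors) {
    el <- tryCatch(
      remDr$findElement(using = s$using, value = s$value),
      error = function(e) NULL
    )
    if (!is.null(el)) return(el)
  }
  NULL  # every strategy failed
}

next_btn <- find_with_fallback(remDr, list(
  list(using = "link text",         value = "Next"),
  list(using = "partial link text", value = "Next"),
  list(using = "css",               value = "a.pagination__next")
))
```

Wrapping each findElement call in tryCatch is what lets the loop move on to the next locator strategy instead of aborting the whole scraping run.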
We then saved the data in .csv format in the data folder:
TED_2 <- video_info
TED <- left_join(title_all, TED_2, by="title")
fwrite(TED, file = here::here("data/TED.csv"))
Finally, we close the server and browser explicitly at the end, since an improperly closed browser could interfere with the next scraping session.
remDr$closeServer()
remDr$close()
rm(remDr)
rm(rD)
gc()
After obtaining the transcript text from TED talk videos through web scraping and saving the scraped data in .csv format, we imported the resulting data into our analysis. Specifically, we imported two tables: “TED.csv” and “add_details_1.csv.” The first table contained 330 observations and 11 variables, while the second table contained 310 observations and 2 variables. These tables were stored in a data folder and served as the primary source of data for our analysis.
# Import data
TED <- read_csv(here::here("data/TED.csv")) #330 obs
views_add <- read_csv(here::here("data/add_details_1.csv")) #310 obs
kable(TED[1:10,], caption = "The example of original TED table") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| page.x | title | speaker.x | views_times.x | cate | page.y | speaker.y | views_times.y | introduction | likes | tanscript |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | How does artificial intelligence learn? | Briana Brownell | Mar 2021 | AI | 1 | Briana Brownell | Mar 2021 |
Today, artificial intelligence helps doctors diagnose patients, pilots fly commercial aircraft, and city planners predict traffic. These AIs are often self-taught, working off a simple set of instructions to create a unique array of rules and strategies. So how exactly does a machine learn? Briana Brownell digs into the three basic ways machines investigate, negotiate, and communicate. [Directed by Champ Panupong Techawongthawon, narrated by Safia Elhillo, music by Ambrose Yu]. S… |
(15K) | Transcript (28 Languages)Bahasa IndonesiaDeutschEnglishEspañolFrançaisItalianoMagyarPolskiPortuguês brasileiroPortuguês de PortugalRomânăTiếng ViệtTürkçeΕλληνικάРусскийСрпски, Srpskiעבריתالعربيةفارسىکوردی سۆرانیবাংলাதமிழ்ภาษาไทยမြန်မာဘာသာ中文 (简体)中文 (繁體)日本語한국어00:09Today, artificial intelligence helps doctors diagnose patients, pilots fly commercial aircraft, and city planners predict traffic. But no matter what these AIs are doing, the computer scientists who designed them likely don’t know exactly how they’re doing it. This is because artificial intelligence is often self-taught, working off a simple set of instructions to create a unique array of rules and strategies. So how exactly does a machine learn? 00:39There are many different ways to build self-teaching programs. But they all rely on the three basic types of machine learning: unsupervised learning, supervised learning, and reinforcement learning. To see these in action, let’s imagine researchers are trying to pull information from a set of medical data containing thousands of patient profiles. 01:01First up, unsupervised learning. This approach would be ideal for analyzing all the profiles to find general similarities and useful patterns. Maybe certain patients have similar disease presentations, or perhaps a treatment produces specific sets of side effects. This broad pattern-seeking approach can be used to identify similarities between patient profiles and find emerging patterns, all without human guidance. 01:28But let’s imagine doctors are looking for something more specific. These physicians want to create an algorithm for diagnosing a particular condition. They begin by collecting two sets of data— medical images and test results from both healthy patients and those diagnosed with the condition. Then, they input this data into a program designed to identify features shared by the sick patients but not the healthy patients. 
Based on how frequently it sees certain features, the program will assign values to those features’ diagnostic significance, generating an algorithm for diagnosing future patients. However, unlike unsupervised learning, doctors and computer scientists have an active role in what happens next. Doctors will make the final diagnosis and check the accuracy of the algorithm’s prediction. Then computer scientists can use the updated datasets to adjust the program’s parameters and improve its accuracy. This hands-on approach is called supervised learning. 02:27Now, let’s say these doctors want to design another algorithm to recommend treatment plans. Since these plans will be implemented in stages, and they may change depending on each individual’s response to treatments, the doctors decide to use reinforcement learning. This program uses an iterative approach to gather feedback about which medications, dosages and treatments are most effective. Then, it compares that data against each patient’s profile to create their unique, optimal treatment plan. As the treatments progress and the program receives more feedback, it can constantly update the plan for each patient. None of these three techniques are inherently smarter than any other. While some require more or less human intervention, they all have their own strengths and weaknesses which makes them best suited for certain tasks. However, by using them together, researchers can build complex AI systems, where individual programs can supervise and teach each other. For example, when our unsupervised learning program finds groups of patients that are similar, it could send that data to a connected supervised learning program. That program could then incorporate this information into its predictions. Or perhaps dozens of reinforcement learning programs might simulate potential patient outcomes to collect feedback about different treatment plans. 
03:43There are numerous ways to create these machine-learning systems, and perhaps the most promising models are those that mimic the relationship between neurons in the brain. These artificial neural networks can use millions of connections to tackle difficult tasks like image recognition, speech recognition, and even language translation. However, the more self-directed these models become, the harder it is for computer scientists to determine how these self-taught algorithms arrive at their solution. Researchers are already looking at ways to make machine learning more transparent. But as AI becomes more involved in our everyday lives, these enigmatic decisions have increasingly large impacts on our work, health, and safety. So as machines continue learning to investigate, negotiate and communicate, we must also consider how to teach them to teach each other to operate ethically. |
| 1 | The danger of AI is weirder than you think | Janelle Shane | Oct 2019 | AI | 1 | Janelle Shane | Oct 2019 | The danger of artificial intelligence isn’t that it’s going to rebel against us, but that it’s going to do exactly what we ask it to do, says AI researcher Janelle Shane. Sharing the weird, sometimes alarming antics of AI algorithms as they try to solve human problems – like creating new ice cream flavors or recognizing cars on the road – Shane shows why AI doesn’t yet measure up to real brains. | (95K) | Transcript (25 Languages)Bahasa IndonesiaDeutschEnglishEspañolFrançaisHrvatskiItalianoMagyarNederlandsPolskiPortuguês brasileiroPortuguês de PortugalRomânăTürkçeČeštinaРусскийУкраїнськаעבריתالعربيةفارسىภาษาไทยမြန်မာဘာသာ中文 (简体)中文 (繁體)한국어00:01So, artificial intelligence is known for disrupting all kinds of industries. What about ice cream? What kind of mind-blowing new flavors could we generate with the power of an advanced artificial intelligence? So I teamed up with a group of coders from Kealing Middle School to find out the answer to this question. They collected over 1,600 existing ice cream flavors, and together, we fed them to an algorithm to see what it would generate. And here are some of the flavors that the AI came up with. 00:40[Pumpkin Trash Break] 00:41(Laughter) 00:43[Peanut Butter Slime] 00:46[Strawberry Cream Disease] 00:48(Laughter) 00:50These flavors are not delicious, as we might have hoped they would be. So the question is: What happened? What went wrong? Is the AI trying to kill us? Or is it trying to do what we asked, and there was a problem? 01:06In movies, when something goes wrong with AI, it’s usually because the AI has decided that it doesn’t want to obey the humans anymore, and it’s got its own goals, thank you very much. In real life, though, the AI that we actually have is not nearly smart enough for that. 
It has the approximate computing power of an earthworm, or maybe at most a single honeybee, and actually, probably maybe less. Like, we’re constantly learning new things about brains that make it clear how much our AIs don’t measure up to real brains. So today’s AI can do a task like identify a pedestrian in a picture, but it doesn’t have a concept of what the pedestrian is beyond that it’s a collection of lines and textures and things. It doesn’t know what a human actually is. So will today’s AI do what we ask it to do? It will if it can, but it might not do what we actually want. 02:04So let’s say that you were trying to get an AI to take this collection of robot parts and assemble them into some kind of robot to get from Point A to Point B. Now, if you were going to try and solve this problem by writing a traditional-style computer program, you would give the program step-by-step instructions on how to take these parts, how to assemble them into a robot with legs and then how to use those legs to walk to Point B. But when you’re using AI to solve the problem, it goes differently. You don’t tell it how to solve the problem, you just give it the goal, and it has to figure out for itself via trial and error how to reach that goal. And it turns out that the way AI tends to solve this particular problem is by doing this: it assembles itself into a tower and then falls over and lands at Point B. And technically, this solves the problem. Technically, it got to Point B. The danger of AI is not that it’s going to rebel against us, it’s that it’s going to do exactly what we ask it to do. So then the trick of working with AI becomes: How do we set up the problem so that it actually does what we want? 03:14So this little robot here is being controlled by an AI. The AI came up with a design for the robot legs and then figured out how to use them to get past all these obstacles. 
But when David Ha set up this experiment, he had to set it up with very, very strict limits on how big the AI was allowed to make the legs, because otherwise … 03:43(Laughter) 03:48And technically, it got to the end of that obstacle course. So you see how hard it is to get AI to do something as simple as just walk. 03:57So seeing the AI do this, you may say, OK, no fair, you can’t just be a tall tower and fall over, you have to actually, like, use legs to walk. And it turns out, that doesn’t always work, either. This AI’s job was to move fast. They didn’t tell it that it had to run facing forward or that it couldn’t use its arms. So this is what you get when you train AI to move fast, you get things like somersaulting and silly walks. It’s really common. So is twitching along the floor in a heap. 04:32(Laughter) 04:35So in my opinion, you know what should have been a whole lot weirder is the “Terminator” robots. Hacking “The Matrix” is another thing that AI will do if you give it a chance. So if you train an AI in a simulation, it will learn how to do things like hack into the simulation’s math errors and harvest them for energy. Or it will figure out how to move faster by glitching repeatedly into the floor. When you’re working with AI, it’s less like working with another human and a lot more like working with some kind of weird force of nature. And it’s really easy to accidentally give AI the wrong problem to solve, and often we don’t realize that until something has actually gone wrong. 05:16So here’s an experiment I did, where I wanted the AI to copy paint colors, to invent new paint colors, given the list like the ones here on the left. And here’s what the AI actually came up with. 05:29[Sindis Poop, Turdly, Suffer, Gray Pubic] 05:32(Laughter) 05:39So technically, it did what I asked it to. 
I thought I was asking it for, like, nice paint color names, but what I was actually asking it to do was just imitate the kinds of letter combinations that it had seen in the original. And I didn’t tell it anything about what words mean, or that there are maybe some words that it should avoid using in these paint colors. So its entire world is the data that I gave it. Like with the ice cream flavors, it doesn’t know about anything else. 06:12So it is through the data that we often accidentally tell AI to do the wrong thing. This is a fish called a tench. And there was a group of researchers who trained an AI to identify this tench in pictures. But then when they asked it what part of the picture it was actually using to identify the fish, here’s what it highlighted. Yes, those are human fingers. Why would it be looking for human fingers if it’s trying to identify a fish? Well, it turns out that the tench is a trophy fish, and so in a lot of pictures that the AI had seen of this fish during training, the fish looked like this. 06:51(Laughter) 06:53And it didn’t know that the fingers aren’t part of the fish. 06:58So you see why it is so hard to design an AI that actually can understand what it’s looking at. And this is why designing the image recognition in self-driving cars is so hard, and why so many self-driving car failures are because the AI got confused. I want to talk about an example from 2016. There was a fatal accident when somebody was using Tesla’s autopilot AI, but instead of using it on the highway like it was designed for, they used it on city streets. And what happened was, a truck drove out in front of the car and the car failed to brake. Now, the AI definitely was trained to recognize trucks in pictures. But what it looks like happened is the AI was trained to recognize trucks on highway driving, where you would expect to see trucks from behind. 
Trucks on the side is not supposed to happen on a highway, and so when the AI saw this truck, it looks like the AI recognized it as most likely to be a road sign and therefore, safe to drive underneath. 08:04Here’s an AI misstep from a different field. Amazon recently had to give up on a résumé-sorting algorithm that they were working on when they discovered that the algorithm had learned to discriminate against women. What happened is they had trained it on example résumés of people who they had hired in the past. And from these examples, the AI learned to avoid the résumés of people who had gone to women’s colleges or who had the word “women” somewhere in their resume, as in, “women’s soccer team” or “Society of Women Engineers.” The AI didn’t know that it wasn’t supposed to copy this particular thing that it had seen the humans do. And technically, it did what they asked it to do. They just accidentally asked it to do the wrong thing. 08:46And this happens all the time with AI. AI can be really destructive and not know it. So the AIs that recommend new content in Facebook, in YouTube, they’re optimized to increase the number of clicks and views. And unfortunately, one way that they have found of doing this is to recommend the content of conspiracy theories or bigotry. The AIs themselves don’t have any concept of what this content actually is, and they don’t have any concept of what the consequences might be of recommending this content. 09:22So, when we’re working with AI, it’s up to us to avoid problems. And avoiding things going wrong, that may come down to the age-old problem of communication, where we as humans have to learn how to communicate with AI. We have to learn what AI is capable of doing and what it’s not, and to understand that, with its tiny little worm brain, AI doesn’t really understand what we’re trying to ask it to do. So in other words, we have to be prepared to work with AI that’s not the super-competent, all-knowing AI of science fiction. 
We have to be prepared to work with an AI that’s the one that we actually have in the present day. And present-day AI is plenty weird enough. 10:09Thank you. 10:11(Applause) |
| 1 | The wonderful and terrifying implications of computers that can learn | Jeremy Howard | Dec 2014 | AI | 1 | Jeremy Howard | Dec 2014 | What happens when we teach a computer how to learn? Technologist Jeremy Howard shares some surprising new developments in the fast-moving field of deep learning, a technique that can give computers the ability to learn Chinese, or to recognize objects in photos, or to help think through a medical diagnosis. (One deep learning tool, after watching hours of YouTube, taught itself the concept of “cats.”) Get caught up on a field that will change the way the computers around you behav… | (80K) | Transcript (26 Languages)DeutschEnglishEspañolFrançaisItalianoLietuvių kalbaMagyarNederlandsPolskiPortuguês brasileiroPortuguês de PortugalRomânăSvenskaTiếng ViệtTürkçeΕλληνικάРусскийУкраїнськаעבריתالعربيةفارسىภาษาไทย中文 (简体)中文 (繁體)日本語한국어FootnotesFootnotes00:00It used to be that if you wanted to get a computer to do something new, you would have to program it. Now, programming, for those of you here that haven’t done it yourself, requires laying out in excruciating detail every single step that you want the computer to do in order to achieve your goal. Now, if you want to do something that you don’t know how to do yourself, then this is going to be a great challenge. 00:24footnotefootnoteSo this was the challenge faced by this man, Arthur Samuel. In 1956, he wanted to get this computer to be able to beat him at checkers. How can you write a program, lay out in excruciating detail, how to be better than you at checkers? So he came up with an idea: he had the computer play against itself thousands of times and learn how to play checkers. And indeed it worked, and in fact, by 1962, this computer had beaten the Connecticut state champion. 00:55footnotefootnoteSo Arthur Samuel was the father of machine learning, and I have a great debt to him, because I am a machine learning practitioner. 
I was the president of Kaggle, a community of over 200,000 machine learning practictioners. Kaggle puts up competitions to try and get them to solve previously unsolved problems, and it’s been successful hundreds of times. So from this vantage point, I was able to find out a lot about what machine learning can do in the past, can do today, and what it could do in the future. Perhaps the first big success of machine learning commercially was Google. Google showed that it is possible to find information by using a computer algorithm, and this algorithm is based on machine learning. Since that time, there have been many commercial successes of machine learning. Companies like Amazon and Netflix use machine learning to suggest products that you might like to buy, movies that you might like to watch. Sometimes, it’s almost creepy. Companies like LinkedIn and Facebook sometimes will tell you about who your friends might be and you have no idea how it did it, and this is because it’s using the power of machine learning. These are algorithms that have learned how to do this from data rather than being programmed by hand. 02:07footnotefootnoteThis is also how IBM was successful in getting Watson to beat the two world champions at “Jeopardy,” answering incredibly subtle and complex questions like this one. [“The ancient ‘Lion of Nimrud’ went missing from this city’s national museum in 2003 (along with a lot of other stuff)”] This is also why we are now able to see the first self-driving cars. If you want to be able to tell the difference between, say, a tree and a pedestrian, well, that’s pretty important. We don’t know how to write those programs by hand, but with machine learning, this is now possible. And in fact, this car has driven over a million miles without any accidents on regular roads. 
02:40 So we now know that computers can learn, and computers can learn to do things that we actually sometimes don’t know how to do ourselves, or maybe can do them better than us. One of the most amazing examples I’ve seen of machine learning happened on a project that I ran at Kaggle where a team run by a guy called Geoffrey Hinton from the University of Toronto won a competition for automatic drug discovery. Now, what was extraordinary here is not just that they beat all of the algorithms developed by Merck or the international academic community, but nobody on the team had any background in chemistry or biology or life sciences, and they did it in two weeks. How did they do this? They used an extraordinary algorithm called deep learning. So important was this that in fact the success was covered in The New York Times in a front page article a few weeks later. This is Geoffrey Hinton here on the left-hand side. Deep learning is an algorithm inspired by how the human brain works, and as a result it’s an algorithm which has no theoretical limitations on what it can do. The more data you give it and the more computation time you give it, the better it gets.

03:48 The New York Times also showed in this article another extraordinary result of deep learning which I’m going to show you now. It shows that computers can listen and understand.

04:00 (Video) Richard Rashid: Now, the last step that I want to be able to take in this process is to actually speak to you in Chinese. Now the key thing there is, we’ve been able to take a large amount of information from many Chinese speakers and produce a text-to-speech system that takes Chinese text and converts it into Chinese language, and then we’ve taken an hour or so of my own voice and we’ve used that to modulate the standard text-to-speech system so that it would sound like me. Again, the result’s not perfect. There are in fact quite a few errors.

(In Chinese) (Applause) There’s much work to be done in this area. (In Chinese) (Applause)

05:01 Jeremy Howard: Well, that was at a machine learning conference in China. It’s not often, actually, at academic conferences that you do hear spontaneous applause, although of course sometimes at TEDx conferences, feel free. Everything you saw there was happening with deep learning. (Applause) Thank you. The transcription in English was deep learning. The translation to Chinese and the text in the top right, deep learning, and the construction of the voice was deep learning as well.

05:26 So deep learning is this extraordinary thing. It’s a single algorithm that can seem to do almost anything, and I discovered that a year earlier, it had also learned to see. In this obscure competition from Germany called the German Traffic Sign Recognition Benchmark, deep learning had learned to recognize traffic signs like this one. Not only could it recognize the traffic signs better than any other algorithm, the leaderboard actually showed it was better than people, about twice as good as people. So by 2011, we had the first example of computers that can see better than people. Since that time, a lot has happened. In 2012, Google announced that they had a deep learning algorithm watch YouTube videos and crunched the data on 16,000 computers for a month, and the computer independently learned about concepts such as people and cats just by watching the videos. This is much like the way that humans learn. Humans don’t learn by being told what they see, but by learning for themselves what these things are. Also in 2012, Geoffrey Hinton, who we saw earlier, won the very popular ImageNet competition, looking to try to figure out from one and a half million images what they’re pictures of. As of 2014, we’re now down to a six percent error rate in image recognition. This is better than people, again.

06:41 So machines really are doing an extraordinarily good job of this, and it is now being used in industry. For example, Google announced last year that they had mapped every single location in France in two hours, and the way they did it was that they fed street view images into a deep learning algorithm to recognize and read street numbers. Imagine how long it would have taken before: dozens of people, many years. This is also happening in China. Baidu is kind of the Chinese Google, I guess, and what you see here in the top left is an example of a picture that I uploaded to Baidu’s deep learning system, and underneath you can see that the system has understood what that picture is and found similar images. The similar images actually have similar backgrounds, similar directions of the faces, even some with their tongue out. This is not clearly looking at the text of a web page. All I uploaded was an image. So we now have computers which really understand what they see and can therefore search databases of hundreds of millions of images in real time.

07:46 So what does it mean now that computers can see? Well, it’s not just that computers can see. In fact, deep learning has done more than that. Complex, nuanced sentences like this one are now understandable with deep learning algorithms. As you can see here, this Stanford-based system showing the red dot at the top has figured out that this sentence is expressing negative sentiment. Deep learning now in fact is near human performance at understanding what sentences are about and what it is saying about those things. Also, deep learning has been used to read Chinese, again at about native Chinese speaker level. This algorithm developed out of Switzerland by people, none of whom speak or understand any Chinese. As I say, using deep learning is about the best system in the world for this, even compared to native human understanding.
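The Stanford system the speaker points to is a recursive neural network; our own sentiment analysis of TED transcripts works on the same sentence-level task. As a much simpler, hedged illustration of what "scoring a sentence's sentiment" means, here is a toy lexicon-based scorer. The word lists and the negation rule are made-up placeholders, not any published lexicon:

```python
# Minimal lexicon-based sentence sentiment scorer (illustrative sketch only).
# The deep learning system shown in the talk is far more sophisticated; this
# toy counts hand-picked positive/negative words and flips polarity after a
# simple negator such as "not".

POSITIVE = {"good", "great", "amazing", "extraordinary", "love", "better"}
NEGATIVE = {"bad", "errors", "worse", "terrified", "confused", "hate"}
NEGATORS = {"not", "never", "no"}

def sentiment(sentence: str) -> str:
    score = 0
    negate = False
    for raw in sentence.lower().split():
        word = raw.strip(".,!?;:'\"")
        if word in NEGATORS:
            negate = True           # flip the polarity of the next sentiment word
            continue
        if word in POSITIVE:
            score += -1 if negate else 1
            negate = False
        elif word in NEGATIVE:
            score += 1 if negate else -1
            negate = False
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("Deep learning is an extraordinary, amazing algorithm"))  # positive
print(sentiment("The result is not good, there are quite a few errors"))  # negative
```

A real pipeline would replace the hand-built lexicon with a trained model, but the interface (sentence in, polarity label out) is the same one our transcript-level analysis aggregates over.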
08:36 This is a system that we put together at my company which shows putting all this stuff together. These are pictures which have no text attached, and as I’m typing in here sentences, in real time it’s understanding these pictures and figuring out what they’re about and finding pictures that are similar to the text that I’m writing. So you can see, it’s actually understanding my sentences and actually understanding these pictures. I know that you’ve seen something like this on Google, where you can type in things and it will show you pictures, but actually what it’s doing is it’s searching the webpage for the text. This is very different from actually understanding the images. This is something that computers have only been able to do for the first time in the last few months.

09:17 So we can see now that computers can not only see but they can also read, and, of course, we’ve shown that they can understand what they hear. Perhaps not surprising now that I’m going to tell you they can write. Here is some text that I generated using a deep learning algorithm yesterday. And here is some text that an algorithm out of Stanford generated. Each of these sentences was generated by a deep learning algorithm to describe each of those pictures. This algorithm before has never seen a man in a black shirt playing a guitar. It’s seen a man before, it’s seen black before, it’s seen a guitar before, but it has independently generated this novel description of this picture. We’re still not quite at human performance here, but we’re close. In tests, humans prefer the computer-generated caption one out of four times. Now this system is now only two weeks old, so probably within the next year, the computer algorithm will be well past human performance at the rate things are going. So computers can also write.

10:16 So we put all this together and it leads to very exciting opportunities. For example, in medicine, a team in Boston announced that they had discovered dozens of new clinically relevant features of tumors which help doctors make a prognosis of a cancer. Very similarly, in Stanford, a group there announced that, looking at tissues under magnification, they’ve developed a machine learning-based system which in fact is better than human pathologists at predicting survival rates for cancer sufferers. In both of these cases, not only were the predictions more accurate, but they generated new insightful science. In the radiology case, they were new clinical indicators that humans can understand. In this pathology case, the computer system actually discovered that the cells around the cancer are as important as the cancer cells themselves in making a diagnosis. This is the opposite of what pathologists had been taught for decades. In each of those two cases, they were systems developed by a combination of medical experts and machine learning experts, but as of last year, we’re now beyond that too. This is an example of identifying cancerous areas of human tissue under a microscope. The system being shown here can identify those areas more accurately, or about as accurately, as human pathologists, but was built entirely with deep learning using no medical expertise by people who have no background in the field. Similarly, here, this neuron segmentation. We can now segment neurons about as accurately as humans can, but this system was developed with deep learning using people with no previous background in medicine.

11:56 So myself, as somebody with no previous background in medicine, I seem to be entirely well qualified to start a new medical company, which I did. I was kind of terrified of doing it, but the theory seemed to suggest that it ought to be possible to do very useful medicine using just these data analytic techniques. And thankfully, the feedback has been fantastic, not just from the media but from the medical community, who have been very supportive. The theory is that we can take the middle part of the medical process and turn that into data analysis as much as possible, leaving doctors to do what they’re best at. I want to give you an example. It now takes us about 15 minutes to generate a new medical diagnostic test and I’ll show you that in real time now, but I’ve compressed it down to three minutes by cutting some pieces out. Rather than showing you creating a medical diagnostic test, I’m going to show you a diagnostic test of car images, because that’s something we can all understand.

12:54 So here we’re starting with about 1.5 million car images, and I want to create something that can split them into the angle of the photo that’s being taken. So these images are entirely unlabeled, so I have to start from scratch. With our deep learning algorithm, it can automatically identify areas of structure in these images. So the nice thing is that the human and the computer can now work together. So the human, as you can see here, is telling the computer about areas of interest which it wants the computer then to try and use to improve its algorithm. Now, these deep learning systems actually are in 16,000-dimensional space, so you can see here the computer rotating this through that space, trying to find new areas of structure. And when it does so successfully, the human who is driving it can then point out the areas that are interesting. So here, the computer has successfully found areas, for example, angles. So as we go through this process, we’re gradually telling the computer more and more about the kinds of structures we’re looking for. You can imagine in a diagnostic test this would be a pathologist identifying areas of pathosis, for example, or a radiologist indicating potentially troublesome nodules. And sometimes it can be difficult for the algorithm. In this case, it got kind of confused. The fronts and the backs of the cars are all mixed up. So here we have to be a bit more careful, manually selecting these fronts as opposed to the backs, then telling the computer that this is a type of group that we’re interested in.

14:21 So we do that for a while, we skip over a little bit, and then we train the machine learning algorithm based on these couple of hundred things, and we hope that it’s gotten a lot better. You can see, it’s now started to fade some of these pictures out, showing us that it already is recognizing how to understand some of these itself. We can then use this concept of similar images, and using similar images, you can now see, the computer at this point is able to entirely find just the fronts of cars. So at this point, the human can tell the computer, okay, yes, you’ve done a good job of that.

14:53 Sometimes, of course, even at this point it’s still difficult to separate out groups. In this case, even after we let the computer try to rotate this for a while, we still find that the left sides and the right sides pictures are all mixed up together. So we can again give the computer some hints, and we say, okay, try and find a projection that separates out the left sides and the right sides as much as possible using this deep learning algorithm. And giving it that hint – ah, okay, it’s been successful. It’s managed to find a way of thinking about these objects that’s separated out these together.

15:26 So you get the idea here. This is a case not where the human is being replaced by a computer, but where they’re working together. What we’re doing here is we’re replacing something that used to take a team of five or six people about seven years and replacing it with something that takes 15 minutes for one person acting alone.

15:50 So this process takes about four or five iterations. You can see we now have 62 percent of our 1.5 million images classified correctly. And at this point, we can start to quite quickly grab whole big sections, check through them to make sure that there’s no mistakes. Where there are mistakes, we can let the computer know about them. And using this kind of process for each of the different groups, we are now up to an 80 percent success rate in classifying the 1.5 million images. And at this point, it’s just a case of finding the small number that aren’t classified correctly, and trying to understand why. And using that approach, by 15 minutes we get to 97 percent classification rates.

16:31 So this kind of technique could allow us to fix a major problem, which is that there’s a lack of medical expertise in the world. The World Economic Forum says that there’s between a 10x and a 20x shortage of physicians in the developing world, and it would take about 300 years to train enough people to fix that problem. So imagine if we can help enhance their efficiency using these deep learning approaches?

16:56 So I’m very excited about the opportunities. I’m also concerned about the problems. The problem here is that every area in blue on this map is somewhere where services are over 80 percent of employment. What are services? These are services. These are also the exact things that computers have just learned how to do. So 80 percent of the world’s employment in the developed world is stuff that computers have just learned how to do. What does that mean? Well, it’ll be fine. They’ll be replaced by other jobs. For example, there will be more jobs for data scientists. Well, not really. It doesn’t take data scientists very long to build these things. For example, these four algorithms were all built by the same guy. So if you think, oh, it’s all happened before, we’ve seen the results in the past of when new things come along and they get replaced by new jobs, what are these new jobs going to be? It’s very hard for us to estimate this, because human performance grows at this gradual rate, but we now have a system, deep learning, that we know actually grows in capability exponentially. And we’re here. So currently, we see the things around us and we say, “Oh, computers are still pretty dumb.” Right? But in five years’ time, computers will be off this chart. So we need to be starting to think about this capability right now.

18:10 We have seen this once before, of course. In the Industrial Revolution, we saw a step change in capability thanks to engines. The thing is, though, that after a while, things flattened out. There was social disruption, but once engines were used to generate power in all the situations, things really settled down. The Machine Learning Revolution is going to be very different from the Industrial Revolution, because the Machine Learning Revolution, it never settles down. The better computers get at intellectual activities, the more they can build better computers to be better at intellectual capabilities, so this is going to be a kind of change that the world has actually never experienced before, so your previous understanding of what’s possible is different.

18:50 This is already impacting us. In the last 25 years, as capital productivity has increased, labor productivity has been flat, in fact even a little bit down.

19:01 So I want us to start having this discussion now. I know that when I often tell people about this situation, people can be quite dismissive. Well, computers can’t really think, they don’t emote, they don’t understand poetry, we don’t really understand how they work. So what? Computers right now can do the things that humans spend most of their time being paid to do, so now’s the time to start thinking about how we’re going to adjust our social structures and economic structures to be aware of this new reality. Thank you. (Applause)
| 1 | How do we find dignity at work? | Roy Bahat and Bryn Freedman | Feb 2019 | AI | (66K) |

Roy Bahat was worried. His company invests in new technology like AI to make businesses more efficient – but, he wondered, what was AI doing to the people whose jobs might change, go away or become less fulfilling? The question sent him on a two-year research odyssey to discover what motivates people, and why we work. In this conversation with curator Bryn Freedman, he shares what he learned, including some surprising insights that will shape the conversation about the future of …

Transcript (19 languages):

00:00 Bryn Freedman: You’re a guy whose company funds these AI programs and invests. So why should we trust you to not have a bias and tell us something really useful for the rest of us about the future of work?

00:17 Roy Bahat: Yes, I am. And when you wake up in the morning and you read the newspaper and it says, “The robots are coming, they may take all our jobs,” as a start-up investor focused on the future of work, our fund was the first one to say artificial intelligence should be a focus for us.

00:32 So I woke up one morning and read that and said, “Oh, my gosh, they’re talking about me. That’s me who’s doing that.” And then I thought: wait a minute. If things continue, then maybe not only will the start-ups in which we invest struggle because there won’t be people to have jobs to pay for the things that they make and buy them, but our economy and society might struggle, too.

00:57 And look, I should be the guy who sits here and tells you, “Everything is going to be fine. It’s all going to work out great. Hey, when they introduced the ATM machine, years later, there’s more tellers in banks.” It’s true. And yet, when I looked at it, I thought, “This is going to accelerate. And if it does accelerate, there’s a chance the center doesn’t hold.” But I figured somebody must know the answer to this; there are so many ideas out there. And I read all the books, and I went to the conferences, and at one point, we counted more than 100 efforts to study the future of work. And it was a frustrating experience, because I’d hear the same back-and-forth over and over again: “The robots are coming!” And then somebody else would say, “Oh, don’t worry about that, they’ve always said that and it turns out OK.” Then somebody else would say, “Well, it’s really about the meaning of your job, anyway.” And then everybody would shrug and go off and have a drink. And it felt like there was this Kabuki theater of this discussion, where nobody was talking to each other.

01:54 And many of the people that I knew and worked with in the technology world were not speaking to policy makers; the policy makers were not speaking to them. And so we partnered with a nonpartisan think tank NGO called New America to study this issue. And we brought together a group of people, including an AI czar at a technology company and a video game designer and a heartland conservative and a Wall Street investor and a socialist magazine editor – literally, all in the same room; it was occasionally awkward – to try to figure out what is it that will happen here.

02:25 The question we asked was simple. It was: What is the effect of technology on work going to be? And we looked out 10 to 20 years, because we wanted to look out far enough that there could be real change, but soon enough that we weren’t talking about teleportation or anything like that. And we recognized – and I think every year we’re reminded of this in the world – that predicting what’s going to happen is hard. So instead of predicting, there are other things you can do. You can try to imagine alternate possible futures, which is what we did. We did a scenario-planning exercise, and we imagined cases where no job is safe. We imagined cases where every job is safe. And we imagined every distinct possibility we could.

03:06 And the result, which really surprised us, was when you think through those futures and you think what should we do, the answers about what we should do actually turn out to be the same, no matter what happens. And the irony of looking out 10 to 20 years into the future is, you realize that the things we want to act on are actually already happening right now. The automation is right now, the future is right now.

03:30 BF: So what does that mean, and what does that tell us? If the future is now, what is it that we should be doing, and what should we be thinking about?

03:37 RB: We have to understand the problem first. And so the data are that as the economy becomes more productive and individual workers become more productive, their wages haven’t risen. If you look at the proportion of prime working-age men, in the United States at least, who work now versus in 1960, we have three times as many men not working. And then you hear the stories.

03:59 I sat down with a group of Walmart workers and said, “What do you think about this cashier, this futuristic self-checkout thing?” They said, “That’s nice, but have you heard about the cash recycler? That’s a machine that’s being installed right now, and is eliminating two jobs at every Walmart right now.” And so we just thought, “Geez. We don’t understand the problem.” And so we looked at the voices that were the ones that were excluded, which is all of the people affected by this change. And we decided to listen to them, sort of “automation and its discontents.”

04:26 And I’ve spent the last couple of years doing that. I’ve been to Flint, Michigan, and Youngstown, Ohio, talking about entrepreneurs, trying to make it work in a very different environment from New York or San Francisco or London or Tokyo. I’ve been to prisons twice to talk to inmates about their jobs after they leave. I’ve sat down with truck drivers to ask them about the self-driving truck, with people who, in addition to their full-time job, care for an aging relative. And when you talk to people, there were two themes that came out loud and clear.

04:56 The first one was that people are less looking for more money or get out of the fear of the robot taking their job, and they just want something stable. They want something predictable. So if you survey people and ask them what they want out of work, for everybody who makes less than 150,000 dollars a year, they’ll take a more stable and secure income, on average, over earning more money. And if you think about the fact that not only for all of the people across the earth who don’t earn a living, but for those who do, the vast majority earn a different amount from month to month and have an instability, all of a sudden you realize, “Wait a minute. We have a real problem on our hands.”

05:35 And the second thing they say, which took us a longer time to understand, is they say they want dignity. And that concept of self-worth through work emerged again and again and again in our conversations.

05:49 BF: So, I certainly appreciate this answer. But you can’t eat dignity, you can’t clothe your children with self-esteem. So, what is that, how do you reconcile – what does dignity mean, and what is the relationship between dignity and stability?

06:06 RB: You can’t eat dignity. You need stability first. And the good news is, many of the conversations that are happening right now are about how we solve that. You know, I’m a proponent of studying guaranteed income, as one example, conversations about how health care gets provided and other benefits. Those conversations are happening, and we’re at a time where we must figure that out. It is the crisis of our era.

06:28 And my point of view after talking to people is that we may do that, and it still might not be enough. Because what we need to do from the beginning is understand what is it about work that gives people dignity, so they can live the lives that they want to live. And so that concept of dignity is … it’s difficult to get your hands around, because when many people hear it – especially, to be honest, rich people – they hear “meaning.” They hear “My work is important to me.” And again, if you survey people and you ask them, “How important is it to you that your work be important to you?” only people who make 150,000 dollars a year or more say that it is important to them that their work be important.

07:12 BF: Meaning, meaningful?

07:13 RB: Just defined as, “Is your work important to you?” Whatever somebody took that to mean. And yet, of course dignity is essential. We talked to truck drivers who said, “I saw my cousin drive, and I got on the open road and it was amazing. And I started making more money than people who went to college.” Then they’d get to the end of their thought and say something like, “People need their fruits and vegetables in the morning, and I’m the guy who gets it to them.”

07:38 We talked to somebody who, in addition to his job, was caring for his aunt. He was making plenty of money. At one point we just asked, “What is it about caring for your aunt? Can’t you just pay somebody to do it?” He said, “My aunt doesn’t want somebody we pay for. My aunt wants me.” So there was this concept there of being needed.

07:56 If you study the word “dignity,” it’s fascinating. It’s one of the oldest words in the English language, from antiquity. And it has two meanings: one is self-worth, and the other is that something is suitable, it’s fitting, meaning that you’re part of something greater than yourself, and it connects to some broader whole. In other words, that you’re needed.

08:15 BF: So how do you answer this question, this concept that we don’t pay teachers, and we don’t pay eldercare workers, and we don’t pay people who really care for people and are needed, enough?

08:26 RB: Well, the good news is, people are finally asking the question. So as AI investors, we often get phone calls from foundations or CEOs and boardrooms saying, “What do we do about this?” And they used to be asking, “What do we do about introducing automation?” And now they’re asking, “What do we do about self-worth?” And they know that the employees who work for them who have a spouse who cares for somebody, that dignity is essential to their ability to just do their job.

08:50 I think there’s two kinds of answers: there’s the money side of just making your life work. That’s stability. You need to eat. And then you think about our culture more broadly, and you ask: Who do we make into heroes? And, you know, what I want is to see the magazine cover that is the person who is the heroic caregiver. Or the Netflix series that dramatizes the person who makes all of our other lives work so we can do the things we do. Let’s make heroes out of those people. That’s the Netflix show that I would binge.

09:20 And we’ve had chroniclers of this before – Studs Terkel, the oral history of the working experience in the United States. And what we need is the experience of needing one another and being connected to each other. Maybe that’s the answer for how we all fit as a society. And the thought exercise, to me, is: if you were to go back 100 years and have people – my grandparents, great-grandparents, a tailor, worked in a mine – they look at what all of us do for a living and say, “That’s not work.” We sit there and type and talk, and there’s no danger of getting hurt. And my guess is that if you were to imagine 100 years from now, we’ll still be doing things for each other. We’ll still need one another. And we just will think of it as work.

10:00 The entire thing I’m trying to say is that dignity should not just be about having a job. Because if you say you need a job to have dignity, which many people say, the second you say that, you say to all the parents and all the teachers and all the caregivers that all of a sudden, because they’re not being paid for what they’re doing, it somehow lacks this essential human quality. To me, that’s the great puzzle of our time: Can we figure out how to provide that stability throughout life, and then can we figure out how to create an inclusive, not just racially, gender, but multigenerationally inclusive – I mean, every different human experience included – in this way of understanding how we can be needed by one another.

10:41 BF: Thank you. RB: Thank you.

10:42 BF: Thank you very much for your participation.

10:44 (Applause)
| 1 | The incredible inventions of intuitive AI | Maurice Conti | Feb 2017 | AI | 1 | Maurice Conti | Feb 2017 | What do you get when you give a design tool a digital nervous system? Computers that improve our ability to think and imagine, and robotic systems that come up with (and build) radical new designs for bridges, cars, drones and much more – all by themselves. Take a tour of the Augmented Age with futurist Maurice Conti and preview a time when robots and humans will work side-by-side to accomplish things neither could do alone. | (220K) | Transcript (23 Languages)EnglishEspañolFrançaisItalianoMagyarPolskiPortuguês brasileiroRomânăSuomiTiếng ViệtTürkçeČeštinaΕλληνικάРусскийСрпски, SrpskiУкраїнськаעבריתالعربيةفارسى中文 (简体)中文 (繁體)日本語한국어Leslie Gauthier, TranslatorCamille Martínez, Reviewer00:00How many of you are creatives, designers, engineers, entrepreneurs, artists, or maybe you just have a really big imagination? Show of hands? (Cheers) 00:10That’s most of you. I have some news for us creatives. Over the course of the next 20 years, more will change around the way we do our work than has happened in the last 2,000. In fact, I think we’re at the dawn of a new age in human history. 00:33Now, there have been four major historical eras defined by the way we work. The Hunter-Gatherer Age lasted several million years. And then the Agricultural Age lasted several thousand years. The Industrial Age lasted a couple of centuries. And now the Information Age has lasted just a few decades. And now today, we’re on the cusp of our next great era as a species. 01:01Welcome to the Augmented Age. In this new era, your natural human capabilities are going to be augmented by computational systems that help you think, robotic systems that help you make, and a digital nervous system that connects you to the world far beyond your natural senses. Let’s start with cognitive augmentation. How many of you are augmented cyborgs? 
01:24(Laughter) 01:26I would actually argue that we’re already augmented. Imagine you’re at a party, and somebody asks you a question that you don’t know the answer to. If you have one of these, in a few seconds, you can know the answer. But this is just a primitive beginning. Even Siri is just a passive tool. In fact, for the last three-and-a-half million years, the tools that we’ve had have been completely passive. They do exactly what we tell them and nothing more. Our very first tool only cut where we struck it. The chisel only carves where the artist points it. And even our most advanced tools do nothing without our explicit direction. In fact, to date, and this is something that frustrates me, we’ve always been limited by this need to manually push our wills into our tools – like, manual, literally using our hands, even with computers. But I’m more like Scotty in “Star Trek.” 02:26(Laughter) 02:28I want to have a conversation with a computer. I want to say, “Computer, let’s design a car,” and the computer shows me a car. And I say, “No, more fast-looking, and less German,” and bang, the computer shows me an option. 02:39(Laughter) 02:42That conversation might be a little ways off, probably less than many of us think, but right now, we’re working on it. Tools are making this leap from being passive to being generative. Generative design tools use a computer and algorithms to synthesize geometry to come up with new designs all by themselves. All it needs are your goals and your constraints. 03:06I’ll give you an example. In the case of this aerial drone chassis, all you would need to do is tell it something like, it has four propellers, you want it to be as lightweight as possible, and you need it to be aerodynamically efficient. Then what the computer does is it explores the entire solution space: every single possibility that solves and meets your criteria – millions of them. It takes big computers to do this. 
But it comes back to us with designs that we, by ourselves, never could’ve imagined. And the computer’s coming up with this stuff all by itself – no one ever drew anything, and it started completely from scratch. And by the way, it’s no accident that the drone body looks just like the pelvis of a flying squirrel. 03:51(Laughter) 03:54It’s because the algorithms are designed to work the same way evolution does. 03:58What’s exciting is we’re starting to see this technology out in the real world. We’ve been working with Airbus for a couple of years on this concept plane for the future. It’s a ways out still. But just recently we used a generative-design AI to come up with this. This is a 3D-printed cabin partition that’s been designed by a computer. It’s stronger than the original yet half the weight, and it will be flying in the Airbus A320 later this year. So computers can now generate; they can come up with their own solutions to our well-defined problems. But they’re not intuitive. They still have to start from scratch every single time, and that’s because they never learn. Unlike Maggie. 04:44(Laughter) 04:45Maggie’s actually smarter than our most advanced design tools. What do I mean by that? If her owner picks up that leash, Maggie knows with a fair degree of certainty it’s time to go for a walk. And how did she learn? Well, every time the owner picked up the leash, they went for a walk. And Maggie did three things: she had to pay attention, she had to remember what happened and she had to retain and create a pattern in her mind. 05:11Interestingly, that’s exactly what computer scientists have been trying to get AIs to do for the last 60 or so years. Back in 1952, they built this computer that could play Tic-Tac-Toe. Big deal. Then 45 years later, in 1997, Deep Blue beats Kasparov at chess. 2011, Watson beats these two humans at Jeopardy, which is much harder for a computer to play than chess is. 
In fact, rather than working from predefined recipes, Watson had to use reasoning to overcome his human opponents. And then a couple of weeks ago, DeepMind’s AlphaGo beats the world’s best human at Go, which is the most difficult game that we have. In fact, in Go, there are more possible moves than there are atoms in the universe. So in order to win, what AlphaGo had to do was develop intuition. And in fact, at some points, AlphaGo’s programmers didn’t understand why AlphaGo was doing what it was doing. 06:19And things are moving really fast. I mean, consider – in the space of a human lifetime, computers have gone from a child’s game to what’s recognized as the pinnacle of strategic thought. What’s basically happening is computers are going from being like Spock to being a lot more like Kirk. 06:39(Laughter) 06:43Right? From pure logic to intuition. Would you cross this bridge? Most of you are saying, “Oh, hell no!” 06:52(Laughter) 06:54And you arrived at that decision in a split second. You just sort of knew that bridge was unsafe. And that’s exactly the kind of intuition that our deep-learning systems are starting to develop right now. Very soon, you’ll literally be able to show something you’ve made, you’ve designed, to a computer, and it will look at it and say, “Sorry, homie, that’ll never work. You have to try again.” Or you could ask it if people are going to like your next song, or your next flavor of ice cream. Or, much more importantly, you could work with a computer to solve a problem that we’ve never faced before. For instance, climate change. We’re not doing a very good job on our own, we could certainly use all the help we can get. That’s what I’m talking about, technology amplifying our cognitive abilities so we can imagine and design things that were simply out of our reach as plain old un-augmented humans. 07:47So what about making all of this crazy new stuff that we’re going to invent and design? 
I think the era of human augmentation is as much about the physical world as it is about the virtual, intellectual realm. How will technology augment us? In the physical world, robotic systems. OK, there’s certainly a fear that robots are going to take jobs away from humans, and that is true in certain sectors. But I’m much more interested in this idea that humans and robots working together are going to augment each other, and start to inhabit a new space. 08:24This is our applied research lab in San Francisco, where one of our areas of focus is advanced robotics, specifically, human-robot collaboration. And this is Bishop, one of our robots. As an experiment, we set it up to help a person working in construction doing repetitive tasks – tasks like cutting out holes for outlets or light switches in drywall. 08:46(Laughter) 08:49So, Bishop’s human partner can tell it what to do in plain English and with simple gestures, kind of like talking to a dog, and then Bishop executes on those instructions with perfect precision. We’re using the human for what the human is good at: awareness, perception and decision making. And we’re using the robot for what it’s good at: precision and repetitiveness. 09:10Here’s another cool project that Bishop worked on. The goal of this project, which we called the HIVE, was to prototype the experience of humans, computers and robots all working together to solve a highly complex design problem. The humans acted as labor. They cruised around the construction site, they manipulated the bamboo – which, by the way, because it’s a non-isomorphic material, is super hard for robots to deal with. But then the robots did this fiber winding, which was almost impossible for a human to do. And then we had an AI that was controlling everything. It was telling the humans what to do, telling the robots what to do and keeping track of thousands of individual components. 
What’s interesting is, building this pavilion was simply not possible without human, robot and AI augmenting each other. 09:57OK, I’ll share one more project. This one’s a little bit crazy. We’re working with Amsterdam-based artist Joris Laarman and his team at MX3D to generatively design and robotically print the world’s first autonomously manufactured bridge. So, Joris and an AI are designing this thing right now, as we speak, in Amsterdam. And when they’re done, we’re going to hit “Go,” and robots will start 3D printing in stainless steel, and then they’re going to keep printing, without human intervention, until the bridge is finished. 10:29So, as computers are going to augment our ability to imagine and design new stuff, robotic systems are going to help us build and make things that we’ve never been able to make before. But what about our ability to sense and control these things? What about a nervous system for the things that we make? 10:48Our nervous system, the human nervous system, tells us everything that’s going on around us. But the nervous system of the things we make is rudimentary at best. For instance, a car doesn’t tell the city’s public works department that it just hit a pothole at the corner of Broadway and Morrison. A building doesn’t tell its designers whether or not the people inside like being there, and the toy manufacturer doesn’t know if a toy is actually being played with – how and where and whether or not it’s any fun. Look, I’m sure that the designers imagined this lifestyle for Barbie when they designed her. 11:22(Laughter) 11:24But what if it turns out that Barbie’s actually really lonely? 11:27(Laughter) 11:31If the designers had known what was really happening in the real world with their designs – the road, the building, Barbie – they could’ve used that knowledge to create an experience that was better for the user. What’s missing is a nervous system connecting us to all of the things that we design, make and use. 
What if all of you had that kind of information flowing to you from the things you create in the real world? With all of the stuff we make, we spend a tremendous amount of money and energy – in fact, last year, about two trillion dollars – convincing people to buy the things we’ve made. But if you had this connection to the things that you design and create after they’re out in the real world, after they’ve been sold or launched or whatever, we could actually change that, and go from making people want our stuff, to just making stuff that people want in the first place. 12:21The good news is, we’re working on digital nervous systems that connect us to the things we design. We’re working on one project with a couple of guys down in Los Angeles called the Bandito Brothers and their team. And one of the things these guys do is build insane cars that do absolutely insane things. These guys are crazy – 12:44(Laughter) 12:45in the best way. And what we’re doing with them is taking a traditional race-car chassis and giving it a nervous system. 12:54So we instrumented it with dozens of sensors, put a world-class driver behind the wheel, took it out to the desert and drove the hell out of it for a week. And the car’s nervous system captured everything that was happening to the car. We captured four billion data points; all of the forces that it was subjected to. And then we did something crazy. We took all of that data, and plugged it into a generative-design AI we call “Dreamcatcher.” So what do you get when you give a design tool a nervous system, and you ask it to build you the ultimate car chassis? You get this. This is something that a human could never have designed. Except a human did design this, but it was a human that was augmented by a generative-design AI, a digital nervous system and robots that can actually fabricate something like this. 
13:47So if this is the future, the Augmented Age, and we’re going to be augmented cognitively, physically and perceptually, what will that look like? What is this wonderland going to be like? 14:00I think we’re going to see a world where we’re moving from things that are fabricated to things that are farmed. Where we’re moving from things that are constructed to that which is grown. We’re going to move from being isolated to being connected. And we’ll move away from extraction to embrace aggregation. I also think we’ll shift from craving obedience from our things to valuing autonomy. 14:30Thanks to our augmented capabilities, our world is going to change dramatically. We’re going to have a world with more variety, more connectedness, more dynamism, more complexity, more adaptability and, of course, more beauty. The shape of things to come will be unlike anything we’ve ever seen before. Why? Because what will be shaping those things is this new partnership between technology, nature and humanity. That, to me, is a future well worth looking forward to. 15:03Thank you all so much. 15:04(Applause) |
| 1 | How AI can bring on a second Industrial Revolution | Kevin Kelly | Dec 2016 | AI | 1 | Kevin Kelly | Dec 2016 | “The actual path of a raindrop as it goes down the valley is unpredictable, but the general direction is inevitable,” says digital visionary Kevin Kelly – and technology is much the same, driven by patterns that are surprising but inevitable. Over the next 20 years, he says, our penchant for making things smarter and smarter will have a profound impact on nearly everything we do. Kelly explores three trends in AI we need to understand in order to embrace it and steer its developm… | (55K) | Transcript (24 Languages) 00:02I’m going to talk a little bit about where technology’s going. And often technology comes to us, we’re surprised by what it brings. But there’s actually a large aspect of technology that’s much more predictable, and that’s because technological systems of all sorts have leanings, they have urgencies, they have tendencies. And those tendencies are derived from the very nature of the physics, chemistry of wires and switches and electrons, and they will make reoccurring patterns again and again. And so those patterns produce these tendencies, these leanings. 00:42You can almost think of it as sort of like gravity. Imagine raindrops falling into a valley. The actual path of a raindrop as it goes down the valley is unpredictable. We cannot see where it’s going, but the general direction is very inevitable: it’s downward. And so these baked-in tendencies and urgencies in technological systems give us a sense of where things are going at the large form. So in a large sense, I would say that telephones were inevitable, but the iPhone was not. The Internet was inevitable, but Twitter was not. 
01:21So we have many ongoing tendencies right now, and I think one of the chief among them is this tendency to make things smarter and smarter. I call it cognifying – cognification – also known as artificial intelligence, or AI. And I think that’s going to be one of the most influential developments and trends and directions and drives in our society in the next 20 years. 01:48So, of course, it’s already here. We already have AI, and often it works in the background, in the back offices of hospitals, where it’s used to diagnose X-rays better than a human doctor. It’s in legal offices, where it’s used to go through legal evidence better than a human paralawyer. It’s used to fly the plane that you came here with. Human pilots only flew it seven to eight minutes, the rest of the time the AI was driving. And of course, in Netflix and Amazon, it’s in the background, making those recommendations. That’s what we have today. 02:22And we have an example, of course, in a more front-facing aspect of it, with the win of the AlphaGo, who beat the world’s greatest Go champion. But it’s more than that. If you play a video game, you’re playing against an AI. But recently, Google taught their AI to actually learn how to play video games. Again, teaching video games was already done, but learning how to play a video game is another step. That’s artificial smartness. What we’re doing is taking this artificial smartness and we’re making it smarter and smarter. 03:06There are three aspects to this general trend that I think are underappreciated; I think we would understand AI a lot better if we understood these three things. I think these things also would help us embrace AI, because it’s only by embracing it that we actually can steer it. We can actually steer the specifics by embracing the larger trend. 03:27So let me talk about those three different aspects. The first one is: our own intelligence has a very poor understanding of what intelligence is. 
We tend to think of intelligence as a single dimension, that it’s kind of like a note that gets louder and louder. It starts like with IQ measurement. It starts with maybe a simple low IQ in a rat or mouse, and maybe there’s more in a chimpanzee, and then maybe there’s more in a stupid person, and then maybe an average person like myself, and then maybe a genius. And this single IQ intelligence is getting greater and greater. That’s completely wrong. That’s not what intelligence is – not what human intelligence is, anyway. It’s much more like a symphony of different notes, and each of these notes is played on a different instrument of cognition. 04:15There are many types of intelligences in our own minds. We have deductive reasoning, we have emotional intelligence, we have spatial intelligence; we have maybe 100 different types that are all grouped together, and they vary in different strengths with different people. And of course, if we go to animals, they also have another basket – another symphony of different kinds of intelligences, and sometimes those same instruments are the same that we have. They can think in the same way, but they may have a different arrangement, and maybe they’re higher in some cases than humans, like long-term memory in a squirrel is actually phenomenal, so it can remember where it buried its nuts. But in other cases they may be lower. 04:58When we go to make machines, we’re going to engineer them in the same way, where we’ll make some of those types of smartness much greater than ours, and many of them won’t be anywhere near ours, because they’re not needed. So we’re going to take these things, these artificial clusters, and we’ll be adding more varieties of artificial cognition to our AIs. We’re going to make them very, very specific. 05:26So your calculator is smarter than you are in arithmetic already; your GPS is smarter than you are in spatial navigation; Google, Bing, are smarter than you are in long-term memory. 
And we’re going to take, again, these kinds of different types of thinking and we’ll put them into, like, a car. The reason why we want to put them in a car so the car drives, is because it’s not driving like a human. It’s not thinking like us. That’s the whole feature of it. It’s not being distracted, it’s not worrying about whether it left the stove on, or whether it should have majored in finance. It’s just driving. 06:05(Laughter) 06:06Just driving, OK? And we actually might even come to advertise these as “consciousness-free.” They’re without consciousness, they’re not concerned about those things, they’re not distracted. 06:17So in general, what we’re trying to do is make as many different types of thinking as we can. We’re going to populate the space of all the different possible types, or species, of thinking. And there actually may be some problems that are so difficult in business and science that our own type of human thinking may not be able to solve them alone. We may need a two-step program, which is to invent new kinds of thinking that we can work alongside of to solve these really large problems, say, like dark energy or quantum gravity. 06:56What we’re doing is making alien intelligences. You might even think of this as, sort of, artificial aliens in some senses. And they’re going to help us think different, because thinking different is the engine of creation and wealth and new economy. 07:13The second aspect of this is that we are going to use AI to basically make a second Industrial Revolution. The first Industrial Revolution was based on the fact that we invented something I would call artificial power. Previous to that, during the Agricultural Revolution, everything that was made had to be made with human muscle or animal power. That was the only way to get anything done. 
The great innovation during the Industrial Revolution was, we harnessed steam power, fossil fuels, to make this artificial power that we could use to do anything we wanted to do. So today when you drive down the highway, you are, with a flick of the switch, commanding 250 horses – 250 horsepower – which we can use to build skyscrapers, to build cities, to build roads, to make factories that would churn out lines of chairs or refrigerators way beyond our own power. And that artificial power can also be distributed on wires on a grid to every home, factory, farmstead, and anybody could buy that artificial power, just by plugging something in. 08:27So this was a source of innovation as well, because a farmer could take a manual hand pump, and they could add this artificial power, this electricity, and he’d have an electric pump. And you multiply that by thousands or tens of thousands of times, and that formula was what brought us the Industrial Revolution. All the things that we see, all this progress that we now enjoy, has come from the fact that we’ve done that. 08:50We’re going to do the same thing now with AI. We’re going to distribute that on a grid, and now you can take that electric pump. You can add some artificial intelligence, and now you have a smart pump. And that, multiplied by a million times, is going to be this second Industrial Revolution. So now the car is going down the highway, it’s 250 horsepower, but in addition, it’s 250 minds. That’s the auto-driven car. It’s like a new commodity; it’s a new utility. The AI is going to flow across the grid – the cloud – in the same way electricity did. 09:22So everything that we had electrified, we’re now going to cognify. And I would suggest, then, that the formula for the next 10,000 start-ups is very, very simple, which is to take x and add AI. That is the formula, that’s what we’re going to be doing. And that is the way in which we’re going to make this second Industrial Revolution. 
And by the way – right now, this minute, you can log on to Google and you can purchase AI for six cents, 100 hits. That’s available right now. 09:54So the third aspect of this is that when we take this AI and embody it, we get robots. And robots are going to be bots, they’re going to be doing many of the tasks that we have already done. A job is just a bunch of tasks, so they’re going to redefine our jobs because they’re going to do some of those tasks. But they’re also going to create whole new categories, a whole new slew of tasks that we didn’t know we wanted to do before. They’re going to actually engender new kinds of jobs, new kinds of tasks that we want done, just as automation made up a whole bunch of new things that we didn’t know we needed before, and now we can’t live without them. So they’re going to produce even more jobs than they take away, but it’s important that a lot of the tasks that we’re going to give them are tasks that can be defined in terms of efficiency or productivity. If you can specify a task, either manual or conceptual, that can be specified in terms of efficiency or productivity, that goes to the bots. Productivity is for robots. What we’re really good at is basically wasting time. 11:04(Laughter) 11:05We’re really good at things that are inefficient. Science is inherently inefficient. It runs on the fact that you have one failure after another. It runs on the fact that you make tests and experiments that don’t work, otherwise you’re not learning. It runs on the fact that there is not a lot of efficiency in it. Innovation by definition is inefficient, because you make prototypes, because you try stuff that fails, that doesn’t work. Exploration is inherently inefficient. Art is not efficient. Human relationships are not efficient. These are all the kinds of things we’re going to gravitate to, because they’re not efficient. Efficiency is for robots. 
We’re also going to learn that we’re going to work with these AIs because they think differently than us. 11:50When Deep Blue beat the world’s best chess champion, people thought it was the end of chess. But actually, it turns out that today, the best chess champion in the world is not an AI. And it’s not a human. It’s the team of a human and an AI. The best medical diagnostician is not a doctor, it’s not an AI, it’s the team. We’re going to be working with these AIs, and I think you’ll be paid in the future by how well you work with these bots. So that’s the third thing, is that they’re different, they’re utility and they are going to be something we work with rather than against. We’re working with these rather than against them. 12:30So, the future: Where does that take us? I think that 25 years from now, they’ll look back and look at our understanding of AI and say, “You didn’t have AI. In fact, you didn’t even have the Internet yet, compared to what we’re going to have 25 years from now.” There are no AI experts right now. There’s a lot of money going to it, there are billions of dollars being spent on it; it’s a huge business, but there are no experts, compared to what we’ll know 20 years from now. So we are just at the beginning of the beginning, we’re in the first hour of all this. We’re in the first hour of the Internet. We’re in the first hour of what’s coming. The most popular AI product in 20 years from now, that everybody uses, has not been invented yet. That means that you’re not late. 13:23Thank you. 13:24(Laughter) 13:25(Applause) |
| 1 | How AI can enhance our memory, work and social lives | Tom Gruber | Aug 2017 | AI | 1 | Tom Gruber | Aug 2017 | How smart can our machines make us? Tom Gruber, co-creator of Siri, wants to make “humanistic AI” that augments and collaborates with us instead of competing with (or replacing) us. He shares his vision for a future where AI helps us achieve superhuman performance in perception, creativity and cognitive function – from turbocharging our design skills to helping us remember everything we’ve ever read and the name of everyone we’ve ever met. “We are in the middle of a renaissance i… | (64K) | Transcript (20 Languages) 00:01I’m here to offer you a new way to think about my field, artificial intelligence. I think the purpose of AI is to empower humans with machine intelligence. And as machines get smarter, we get smarter. I call this “humanistic AI” – artificial intelligence designed to meet human needs by collaborating and augmenting people. Now, today I’m happy to see that the idea of an intelligent assistant is mainstream. It’s the well-accepted metaphor for the interface between humans and AI. And the one I helped create is called Siri. 00:42You know Siri. Siri is the thing that knows your intent and helps you do it for you, helps you get things done. But what you might not know is that we designed Siri as humanistic AI, to augment people with a conversational interface that made it possible for them to use mobile computing, regardless of who they were and their abilities. 01:06Now for most of us, the impact of this technology is to make things a little bit easier to use. But for my friend Daniel, the impact of the AI in these systems is a life changer. 
You see, Daniel is a really social guy, and he’s blind and quadriplegic, which makes it hard to use those devices that we all take for granted. The last time I was at his house, his brother said, “Hang on a second, Daniel’s not ready. He’s on the phone with a woman he met online.” I’m like, “That’s cool, how’d he do it?” Well, Daniel uses Siri to manage his own social life – his email, text and phone – without depending on his caregivers. This is kind of interesting, right? The irony here is great. Here’s the man whose relationship with AI helps him have relationships with genuine human beings. And this is humanistic AI. 02:07Another example with life-changing consequences is diagnosing cancer. When a doctor suspects cancer, they take a sample and send it to a pathologist, who looks at it under a microscope. Now, pathologists look at hundreds of slides and millions of cells every day. So to support this task, some researchers made an AI classifier. Now, the classifier says, “Is this cancer or is this not cancer?” looking at the pictures. The classifier was pretty good, but not as good as the person, who got it right most of the time. 02:46But when they combine the ability of the machine and the human together, accuracy went to 99.5 percent. Adding that AI to a partnership eliminated 85 percent of the errors that the human pathologist would have made working alone. That’s a lot of cancer that would have otherwise gone untreated. Now, for the curious, it turns out that the human was better at rejecting false positives, and the machine was better at recognizing those hard-to-spot cases. But the lesson here isn’t about which agent is better at this image-classification task. Those things are changing every day. The lesson here is that by combining the abilities of the human and machine, it created a partnership that had superhuman performance. And that is humanistic AI. 03:42Now let’s look at another example with turbocharging performance. 
This is design. Now, let’s say you’re an engineer. You want to design a new frame for a drone. You get out your favorite software tools, CAD tools, and you enter the form and the materials, and then you analyze performance. That gives you one design. If you give those same tools to an AI, it can generate thousands of designs. 04:10This video by Autodesk is amazing. This is real stuff. So this transforms how we do design. The human engineer now says what the design should achieve, and the machine says, “Here’s the possibilities.” Now in her job, the engineer’s job is to pick the one that best meets the goals of the design, which she knows as a human better than anyone else, using human judgment and expertise. In this case, the winning form looks kind of like something nature would have designed, minus a few million years of evolution and all that unnecessary fur. 04:48Now let’s see where this idea of humanistic AI might lead us if we follow it into the speculative beyond. What’s a kind of augmentation that we would all like to have? Well, how about cognitive enhancement? Instead of asking, “How smart can we make our machines?” let’s ask “How smart can our machines make us?” I mean, take memory for example. Memory is the foundation of human intelligence. But human memory is famously flawed. We’re great at telling stories, but not getting the details right. And our memories – they decay over time. I mean, like, where did the ’60s go, and can I go there, too? 05:34(Laughter) 05:36But what if you could have a memory that was as good as computer memory, and was about your life? What if you could remember every person you ever met, how to pronounce their name, their family details, their favorite sports, the last conversation you had with them? If you had this memory all your life, you could have the AI look at all the interactions you had with people over time and help you reflect on the long arc of your relationships. 
What if you could have the AI read everything you’ve ever read and listen to every song you’ve ever heard? From the tiniest clue, it could help you retrieve anything you’ve ever seen or heard before. Imagine what that would do for the ability to make new connections and form new ideas. 06:26And what about our bodies? What if we could remember the consequences of every food we eat, every pill we take, every all-nighter we pull? We could do our own science on our own data about what makes us feel good and stay healthy. And imagine how this could revolutionize the way we manage allergies and chronic disease. 06:52I believe that AI will make personal memory enhancement a reality. I can’t say when or what form factors are involved, but I think it’s inevitable, because the very things that make AI successful today – the availability of comprehensive data and the ability for machines to make sense of that data – can be applied to the data of our lives. And those data are here today, available for all of us, because we lead digitally mediated lives, in mobile and online. 07:31In my view, a personal memory is a private memory. We get to choose what is and is not recalled and retained. It’s absolutely essential that this be kept very secure. 07:45Now for most of us, the impact of augmented personal memory will be a more improved mental gain, maybe, hopefully, a bit more social grace. But for the millions who suffer from Alzheimer’s and dementia, the difference that augmented memory could make is a difference between a life of isolation and a life of dignity and connection. 08:12We are in the middle of a renaissance in artificial intelligence right now. I mean, in just the past few years, we’re beginning to see solutions to AI problems that we have struggled with literally for decades: speech understanding, text understanding, image understanding. We have a choice in how we use this powerful technology. 
We can choose to use AI to automate and compete with us, or we can use AI to augment and collaborate with us, to overcome our cognitive limitations and to help us do what we want to do, only better. And as we discover new ways to give machines intelligence, we can distribute that intelligence to all of the AI assistants in the world, and therefore to every person, regardless of circumstance. And that is why, every time a machine gets smarter, we get smarter. 09:21That is an AI worth spreading. 09:24Thank you. 09:25(Applause) |
| 1 | We’re building a dystopia just to make people click on ads | Zeynep Tufekci | Oct 2017 | AI | 1 | Zeynep Tufekci | Oct 2017 | We’re building an artificial intelligence-powered dystopia, one click at a time, says techno-sociologist Zeynep Tufekci. In an eye-opening talk, she details how the same algorithms companies like Facebook, Google and Amazon use to get you to click on ads are also used to organize your access to political and social information. And the machines aren’t even the real threat. What we need to understand is how the powerful might use AI to control us – and what we can do in response. | (100K) | Transcript (23 Languages) 00:00So when people voice fears of artificial intelligence, very often, they invoke images of humanoid robots run amok. You know? Terminator? You know, that might be something to consider, but that’s a distant threat. Or, we fret about digital surveillance with metaphors from the past. “1984,” George Orwell’s “1984,” it’s hitting the bestseller lists again. It’s a great book, but it’s not the correct dystopia for the 21st century. What we need to fear most is not what artificial intelligence will do to us on its own, but how the people in power will use artificial intelligence to control us and to manipulate us in novel, sometimes hidden, subtle and unexpected ways. Much of the technology that threatens our freedom and our dignity in the near-term future is being developed by companies in the business of capturing and selling our data and our attention to advertisers and others: Facebook, Google, Amazon, Alibaba, Tencent. 01:14Now, artificial intelligence has started bolstering their business as well. And it may seem like artificial intelligence is just the next thing after online ads. It’s not. It’s a jump in category. 
It’s a whole different world, and it has great potential. It could accelerate our understanding of many areas of study and research. But to paraphrase a famous Hollywood philosopher, “With prodigious potential comes prodigious risk.” 01:49Now let’s look at a basic fact of our digital lives, online ads. Right? We kind of dismiss them. They seem crude, ineffective. We’ve all had the experience of being followed on the web by an ad based on something we searched or read. You know, you look up a pair of boots and for a week, those boots are following you around everywhere you go. Even after you succumb and buy them, they’re still following you around. We’re kind of inured to that kind of basic, cheap manipulation. We roll our eyes and we think, “You know what? These things don’t work.” Except, online, the digital technologies are not just ads. Now, to understand that, let’s think of a physical world example. You know how, at the checkout counters at supermarkets, near the cashier, there’s candy and gum at the eye level of kids? That’s designed to make them whine at their parents just as the parents are about to sort of check out. Now, that’s a persuasion architecture. It’s not nice, but it kind of works. That’s why you see it in every supermarket. Now, in the physical world, such persuasion architectures are kind of limited, because you can only put so many things by the cashier. Right? And the candy and gum, it’s the same for everyone, even though it mostly works only for people who have whiny little humans beside them. In the physical world, we live with those limitations. 03:22In the digital world, though, persuasion architectures can be built at the scale of billions and they can target, infer, understand and be deployed at individuals one by one by figuring out your weaknesses, and they can be sent to everyone’s phone private screen, so it’s not visible to us. And that’s different. And that’s just one of the basic things that artificial intelligence can do. 
03:52Now, let’s take an example. Let’s say you want to sell plane tickets to Vegas. Right? So in the old world, you could think of some demographics to target based on experience and what you can guess. You might try to advertise to, oh, men between the ages of 25 and 35, or people who have a high limit on their credit card, or retired couples. Right? That’s what you would do in the past. 04:16With big data and machine learning, that’s not how it works anymore. So to imagine that, think of all the data that Facebook has on you: every status update you ever typed, every Messenger conversation, every place you logged in from, all your photographs that you uploaded there. If you start typing something and change your mind and delete it, Facebook keeps those and analyzes them, too. Increasingly, it tries to match you with your offline data. It also purchases a lot of data from data brokers. It could be everything from your financial records to a good chunk of your browsing history. Right? In the US, such data is routinely collected, collated and sold. In Europe, they have tougher rules. 05:11So what happens then is, by churning through all that data, these machine-learning algorithms – that’s why they’re called learning algorithms – they learn to understand the characteristics of people who purchased tickets to Vegas before. When they learn this from existing data, they also learn how to apply this to new people. So if they’re presented with a new person, they can classify whether that person is likely to buy a ticket to Vegas or not. Fine. You’re thinking, an offer to buy tickets to Vegas. I can ignore that. But the problem isn’t that. The problem is, we no longer really understand how these complex algorithms work. We don’t understand how they’re doing this categorization. 
It’s giant matrices, thousands of rows and columns, maybe millions of rows and columns, and not the programmers and not anybody who looks at it, even if you have all the data, understands anymore how exactly it’s operating any more than you’d know what I was thinking right now if you were shown a cross section of my brain. It’s like we’re not programming anymore, we’re growing intelligence that we don’t truly understand. 06:40And these things only work if there’s an enormous amount of data, so they also encourage deep surveillance on all of us so that the machine learning algorithms can work. That’s why Facebook wants to collect all the data it can about you. The algorithms work better. 06:56So let’s push that Vegas example a bit. What if the system that we do not understand was picking up that it’s easier to sell Vegas tickets to people who are bipolar and about to enter the manic phase. Such people tend to become overspenders, compulsive gamblers. They could do this, and you’d have no clue that’s what they were picking up on. I gave this example to a bunch of computer scientists once and afterwards, one of them came up to me. He was troubled and he said, “That’s why I couldn’t publish it.” I was like, “Couldn’t publish what?” He had tried to see whether you can indeed figure out the onset of mania from social media posts before clinical symptoms, and it had worked, and it had worked very well, and he had no idea how it worked or what it was picking up on. 07:54Now, the problem isn’t solved if he doesn’t publish it, because there are already companies that are developing this kind of technology, and a lot of the stuff is just off the shelf. This is not very difficult anymore. 08:09Do you ever go on YouTube meaning to watch one video and an hour later you’ve watched 27? You know how YouTube has this column on the right that says, “Up next” and it autoplays something? 
It’s an algorithm picking what it thinks that you might be interested in and maybe not find on your own. It’s not a human editor. It’s what algorithms do. It picks up on what you have watched and what people like you have watched, and infers that that must be what you’re interested in, what you want more of, and just shows you more. It sounds like a benign and useful feature, except when it isn’t. 08:49So in 2016, I attended rallies of then-candidate Donald Trump to study as a scholar the movement supporting him. I study social movements, so I was studying it, too. And then I wanted to write something about one of his rallies, so I watched it a few times on YouTube. YouTube started recommending to me and autoplaying to me white supremacist videos in increasing order of extremism. If I watched one, it served up one even more extreme and autoplayed that one, too. If you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays conspiracy left, and it goes downhill from there. 09:40Well, you might be thinking, this is politics, but it’s not. This isn’t about politics. This is just the algorithm figuring out human behavior. I once watched a video about vegetarianism on YouTube and YouTube recommended and autoplayed a video about being vegan. It’s like you’re never hardcore enough for YouTube. 10:00(Laughter) 10:02So what’s going on? Now, YouTube’s algorithm is proprietary, but here’s what I think is going on. The algorithm has figured out that if you can entice people into thinking that you can show them something more hardcore, they’re more likely to stay on the site watching video after video going down that rabbit hole while Google serves them ads. Now, with nobody minding the ethics of the store, these sites can profile people who are Jew haters, who think that Jews are parasites and who have such explicit anti-Semitic content, and let you target them with ads. 
They can also mobilize algorithms to find for you look-alike audiences, people who do not have such explicit anti-Semitic content on their profile but who the algorithm detects may be susceptible to such messages, and lets you target them with ads, too. Now, this may sound like an implausible example, but this is real. ProPublica investigated this and found that you can indeed do this on Facebook, and Facebook helpfully offered up suggestions on how to broaden that audience. BuzzFeed tried it for Google, and very quickly they found, yep, you can do it on Google, too. And it wasn’t even expensive. The ProPublica reporter spent about 30 dollars to target this category. 11:50So last year, Donald Trump’s social media manager disclosed that they were using Facebook dark posts to demobilize people, not to persuade them, but to convince them not to vote at all. And to do that, they targeted specifically, for example, African-American men in key cities like Philadelphia, and I’m going to read exactly what he said. I’m quoting. 12:17They were using “nonpublic posts whose viewership the campaign controls so that only the people we want to see it see it. We modeled this. It will dramatically affect her ability to turn these people out.” 12:33What’s in those dark posts? We have no idea. Facebook won’t tell us. 12:40So Facebook also algorithmically arranges the posts that your friends put on Facebook, or the pages you follow. It doesn’t show you everything chronologically. It puts the order in the way that the algorithm thinks will entice you to stay on the site longer. 12:59Now, so this has a lot of consequences. You may be thinking somebody is snubbing you on Facebook. The algorithm may never be showing your post to them. The algorithm is prioritizing some of them and burying the others. 13:17Experiments show that what the algorithm picks to show you can affect your emotions. But that’s not all. It also affects political behavior. 
So in 2010, in the midterm elections, Facebook did an experiment on 61 million people in the US that was disclosed after the fact. So some people were shown, “Today is election day,” the simpler one, and some people were shown the one with that tiny tweak with those little thumbnails of your friends who clicked on “I voted.” This simple tweak. OK? So the pictures were the only change, and that post shown just once turned out an additional 340,000 voters in that election, according to this research as confirmed by the voter rolls. A fluke? No. Because in 2012, they repeated the same experiment. And that time, that civic message shown just once turned out an additional 270,000 voters. For reference, the 2016 US presidential election was decided by about 100,000 votes. Now, Facebook can also very easily infer what your politics are, even if you’ve never disclosed them on the site. Right? These algorithms can do that quite easily. What if a platform with that kind of power decides to turn out supporters of one candidate over the other? How would we even know about it? 15:13Now, we started from someplace seemingly innocuous – online ads following us around – and we’ve landed someplace else. As a public and as citizens, we no longer know if we’re seeing the same information or what anybody else is seeing, and without a common basis of information, little by little, public debate is becoming impossible, and we’re just at the beginning stages of this. These algorithms can quite easily infer things like people’s ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age and gender, just from Facebook likes. These algorithms can identify protesters even if their faces are partially concealed. These algorithms may be able to detect people’s sexual orientation just from their dating profile pictures. 
16:21Now, these are probabilistic guesses, so they’re not going to be 100 percent right, but I don’t see the powerful resisting the temptation to use these technologies just because there are some false positives, which will of course create a whole other layer of problems. Imagine what a state can do with the immense amount of data it has on its citizens. China is already using face detection technology to identify and arrest people. And here’s the tragedy: we’re building this infrastructure of surveillance authoritarianism merely to get people to click on ads. And this won’t be Orwell’s authoritarianism. This isn’t “1984.” Now, if authoritarianism is using overt fear to terrorize us, we’ll all be scared, but we’ll know it, we’ll hate it and we’ll resist it. But if the people in power are using these algorithms to quietly watch us, to judge us and to nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one using their personal, individual weaknesses and vulnerabilities, and if they’re doing it at scale through our private screens so that we don’t even know what our fellow citizens and neighbors are seeing, that authoritarianism will envelop us like a spider’s web and we may not even know we’re in it. 18:10So Facebook’s market capitalization is approaching half a trillion dollars. It’s because it works great as a persuasion architecture. But the structure of that architecture is the same whether you’re selling shoes or whether you’re selling politics. The algorithms do not know the difference. The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows, and that’s what’s got to change. 18:50Now, don’t get me wrong, we use digital platforms because they provide us with great value. I use Facebook to keep in touch with friends and family around the world. 
I’ve written about how crucial social media is for social movements. I have studied how these technologies can be used to circumvent censorship around the world. But it’s not that the people who run, you know, Facebook or Google are maliciously and deliberately trying to make the country or the world more polarized and encourage extremism. I read the many well-intentioned statements that these people put out. But it’s not the intent or the statements people in technology make that matter, it’s the structures and business models they’re building. And that’s the core of the problem. Either Facebook is a giant con of half a trillion dollars and ads don’t work on the site, it doesn’t work as a persuasion architecture, or its power of influence is of great concern. It’s either one or the other. It’s similar for Google, too. 20:12So what can we do? This needs to change. Now, I can’t offer a simple recipe, because we need to restructure the whole way our digital technology operates. Everything from the way technology is developed to the way the incentives, economic and otherwise, are built into the system. We have to face and try to deal with the lack of transparency created by the proprietary algorithms, the structural challenge of machine learning’s opacity, all this indiscriminate data that’s being collected about us. We have a big task in front of us. We have to mobilize our technology, our creativity and yes, our politics so that we can build artificial intelligence that supports us in our human goals but that is also constrained by our human values. And I understand this won’t be easy. We might not even easily agree on what those terms mean. But if we take seriously how these systems that we depend on for so much operate, I don’t see how we can postpone this conversation anymore. These structures are organizing how we function and they’re controlling what we can and we cannot do. And many of these ad-financed platforms, they boast that they’re free. 
In this context, it means that we are the product that’s being sold. We need a digital economy where our data and our attention is not for sale to the highest-bidding authoritarian or demagogue. 22:11(Applause) 22:18So to go back to that Hollywood paraphrase, we do want the prodigious potential of artificial intelligence and digital technology to blossom, but for that, we must face this prodigious menace, open-eyed and now. 22:36Thank you. 22:37(Applause) |
| 1 | How AI can save our humanity | Kai-Fu Lee | Aug 2018 | AI | 1 | Kai-Fu Lee | Aug 2018 | AI is massively transforming our world, but there’s one thing it cannot do: love. In a visionary talk, computer scientist Kai-Fu Lee details how the US and China are driving a deep learning revolution – and shares a blueprint for how humans can thrive in the age of AI by harnessing compassion and creativity. “AI is serendipity,” Lee says. “It is here to liberate us from routine jobs, and it is here to remind us what it is that makes us human.” | (123K) | 00:00I’m going to talk about how AI and mankind can coexist, but first, we have to rethink about our human values. So let me first make a confession about my errors in my values. 00:13It was 11 o’clock, December 16, 1991. I was about to become a father for the first time. My wife, Shen-Ling, lay in the hospital bed going through a very difficult 12-hour labor. I sat by her bedside but looked anxiously at my watch, and I knew something that she didn’t. I knew that if in one hour, our child didn’t come, I was going to leave her there and go back to work and make a presentation about AI to my boss, Apple’s CEO. Fortunately, my daughter was born at 11:30 – 00:55(Laughter) 00:57(Applause) 00:59sparing me from doing the unthinkable, and to this day, I am so sorry for letting my work ethic take precedence over love for my family. 01:10(Applause) 01:16My AI talk, however, went off brilliantly. 01:18(Laughter) 01:21Apple loved my work and decided to announce it at TED1992, 26 years ago on this very stage. I thought I had made one of the biggest, most important discoveries in AI, and so did the “Wall Street Journal” on the following day. 
01:39But as far as discoveries went, it turned out, I didn’t discover India, or America. Perhaps I discovered a little island off of Portugal. But the AI era of discovery continued, and more scientists poured their souls into it. About 10 years ago, the grand AI discovery was made by three North American scientists, and it’s known as deep learning. 02:05Deep learning is a technology that can take a huge amount of data within one single domain and learn to predict or decide at superhuman accuracy. For example, if we show the deep learning network a massive number of food photos, it can recognize food such as hot dog or no hot dog. 02:26(Applause) 02:29Or if we show it many pictures and videos and sensor data from driving on the highway, it can actually drive a car as well as a human being on the highway. And what if we showed this deep learning network all the speeches made by President Trump? Then this artificially intelligent President Trump, actually the network – 02:55(Laughter) 02:57can – 02:58(Applause) 03:02You like double oxymorons, huh? 03:05(Laughter) 03:09(Applause) 03:15So this network, if given the request to make a speech about AI, he, or it, might say – 03:24(Recording) Donald Trump: It’s a great thing to build a better world with artificial intelligence. 03:29Kai-Fu Lee: And maybe in another language? 03:31DT: (Speaking Chinese) 03:33(Laughter) 03:34KFL: You didn’t know he knew Chinese, did you? 03:38So deep learning has become the core in the era of AI discovery, and that’s led by the US. But we’re now in the era of implementation, where what really matters is execution, product quality, speed and data. And that’s where China comes in. Chinese entrepreneurs, who I fund as a venture capitalist, are incredible workers, amazing work ethic. My example in the delivery room is nothing compared to how hard people work in China. 
As an example, one startup tried to claim work-life balance: “Come work for us because we are 996.” And what does that mean? It means the work hours of 9am to 9pm, six days a week. That’s contrasted with other startups that do 997. 04:27And the Chinese product quality has consistently gone up in the past decade, and that’s because of a fiercely competitive environment. In Silicon Valley, entrepreneurs compete in a very gentlemanly fashion, sort of like in old wars in which each side took turns to fire at each other. 04:47(Laughter) 04:48But in the Chinese environment, it’s truly a gladiatorial fight to the death. In such a brutal environment, entrepreneurs learn to grow very rapidly, they learn to make their products better at lightning speed, and they learn to hone their business models until they’re impregnable. As a result, great Chinese products like WeChat and Weibo are arguably better than the equivalent American products from Facebook and Twitter. 05:19And the Chinese market embraces this change and accelerated change and paradigm shifts. As an example, if any of you go to China, you will see it’s almost cashless and credit card-less, because that thing that we all talk about, mobile payment, has become the reality in China. In the last year, 18.8 trillion US dollars were transacted on mobile internet, and that’s because of very robust technologies built behind it. It’s even bigger than the China GDP. And this technology, you can say, how can it be bigger than the GDP? Because it includes all transactions: wholesale, channels, retail, online, offline, going into a shopping mall or going into a farmers market like this. The technology is used by 700 million people to pay each other, not just merchants, so it’s peer to peer, and it’s almost transaction-fee-free. And it’s instantaneous, and it’s used everywhere. And finally, the China market is enormous. 
This market is large, which helps give entrepreneurs more users, more revenue, more investment, but most importantly, it gives the entrepreneurs a chance to collect a huge amount of data which becomes rocket fuel for the AI engine. So as a result, the Chinese AI companies have leaped ahead so that today, the most valuable companies in computer vision, speech recognition, speech synthesis, machine translation and drones are all Chinese companies. 06:59So with the US leading the era of discovery and China leading the era of implementation, we are now in an amazing age where the dual engine of the two superpowers are working together to drive the fastest revolution in technology that we have ever seen as humans. And this will bring tremendous wealth, unprecedented wealth: 16 trillion dollars, according to PwC, in terms of added GDP to the worldwide GDP by 2030. It will also bring immense challenges in terms of potential job replacements. Whereas in the Industrial Age it created more jobs because craftsman jobs were being decomposed into jobs in the assembly line, so more jobs were created. But AI completely replaces the individual jobs in the assembly line with robots. And it’s not just in factories, but truckers, drivers and even jobs like telesales, customer service and hematologists as well as radiologists over the next 15 years are going to be gradually replaced by artificial intelligence. And only the creative jobs – 08:20(Laughter) 08:22I have to make myself safe, right? Really, the creative jobs are the ones that are protected, because AI can optimize but not create. 08:33But what’s more serious than the loss of jobs is the loss of meaning, because the work ethic in the Industrial Age has brainwashed us into thinking that work is the reason we exist, that work defined the meaning of our lives. And I was a prime and willing victim to that type of workaholic thinking. I worked incredibly hard. 
That’s why I almost left my wife in the delivery room, that’s why I worked 996 alongside my entrepreneurs. And that obsession that I had with work ended abruptly a few years ago when I was diagnosed with fourth stage lymphoma. The PET scan here shows over 20 malignant tumors jumping out like fireballs, melting away my ambition. But more importantly, it helped me reexamine my life. Knowing that I may only have a few months to live caused me to see how foolish it was for me to base my entire self-worth on how hard I worked and the accomplishments from hard work. My priorities were completely out of order. I neglected my family. My father had passed away, and I never had a chance to tell him I loved him. My mother had dementia and no longer recognized me, and my children had grown up. 10:04During my chemotherapy, I read a book by Bronnie Ware who talked about dying wishes and regrets of the people in the deathbed. She found that facing death, nobody regretted that they didn’t work hard enough in this life. They only regretted that they didn’t spend enough time with their loved ones and that they didn’t spread their love. 10:30So I am fortunately today in remission. 10:34(Applause) 10:41So I can be back at TED again to share with you that I have changed my ways. I now only work 965 – occasionally 996, but usually 965. I moved closer to my mother, my wife usually travels with me, and when my kids have vacation, if they don’t come home, I go to them. So it’s a new form of life that helped me recognize how important it is that love is for me, and facing death helped me change my life, but it also helped me see a new way of how AI should impact mankind and work and coexist with mankind, that really, AI is taking away a lot of routine jobs, but routine jobs are not what we’re about. 11:32Why we exist is love. 
When we hold our newborn baby, love at first sight, or when we help someone in need, humans are uniquely able to give and receive love, and that’s what differentiates us from AI. 11:48Despite what science fiction may portray, I can responsibly tell you that AI has no love. When AlphaGo defeated the world champion Ke Jie, while Ke Jie was crying and loving the game of go, AlphaGo felt no happiness from winning and certainly no desire to hug a loved one. 12:11So how do we differentiate ourselves as humans in the age of AI? We talked about the axis of creativity, and certainly that is one possibility, and now we introduce a new axis that we can call compassion, love, or empathy. Those are things that AI cannot do. So as AI takes away the routine jobs, I like to think we can, we should and we must create jobs of compassion. You might ask how many of those there are, but I would ask you: Do you not think that we are going to need a lot of social workers to help us make this transition? Do you not think we need a lot of compassionate caregivers to give more medical care to more people? Do you not think we’re going to need 10 times more teachers to help our children find their way to survive and thrive in this brave new world? And with all the newfound wealth, should we not also make labors of love into careers and let elderly accompaniment or homeschooling become careers also? 13:18(Applause) 13:24This graph is surely not perfect, but it points at four ways that we can work with AI. AI will come and take away the routine jobs and in due time, we will be thankful. AI will become great tools for the creatives so that scientists, artists, musicians and writers can be even more creative. AI will work with humans as analytical tools that humans can wrap their warmth around for the high-compassion jobs. And we can always differentiate ourselves with the uniquely capable jobs that are both compassionate and creative, using and leveraging our irreplaceable brains and hearts. 
So there you have it: a blueprint of coexistence for humans and AI. 14:15AI is serendipity. It is here to liberate us from routine jobs, and it is here to remind us what it is that makes us human. So let us choose to embrace AI and to love one another. 14:28Thank you. 14:29(Applause) |
| 1 | Why fascism is so tempting — and how your data could power it | Yuval Noah Harari | May 2018 | AI | 1 | Yuval Noah Harari | May 2018 | In a profound talk about technology and power, author and historian Yuval Noah Harari explains the important difference between fascism and nationalism – and what the consolidation of our data means for the future of democracy. Appearing as a hologram live from Tel Aviv, Harari warns that the greatest danger that now faces liberal democracy is that the revolution in information technology will make dictatorships more efficient and capable of control. “The enemies of liberal democ… | (139K) | 00:00Hello, everyone. It’s a bit funny, because I did write that humans will become digital, but I didn’t think it will happen so fast and that it will happen to me. But here I am, as a digital avatar, and here you are, so let’s start. And let’s start with a question. How many fascists are there in the audience today? 00:26(Laughter) 00:27Well, it’s a bit difficult to say, because we’ve forgotten what fascism is. People now use the term “fascist” as a kind of general-purpose abuse. Or they confuse fascism with nationalism. So let’s take a few minutes to clarify what fascism actually is, and how it is different from nationalism. 00:53The milder forms of nationalism have been among the most benevolent of human creations. Nations are communities of millions of strangers who don’t really know each other. For example, I don’t know the eight million people who share my Israeli citizenship. But thanks to nationalism, we can all care about one another and cooperate effectively. This is very good. 
Some people, like John Lennon, imagine that without nationalism, the world will be a peaceful paradise. But far more likely, without nationalism, we would have been living in tribal chaos. If you look today at the most prosperous and peaceful countries in the world, countries like Sweden and Switzerland and Japan, you will see that they have a very strong sense of nationalism. In contrast, countries that lack a strong sense of nationalism, like Congo and Somalia and Afghanistan, tend to be violent and poor. 02:04So what is fascism, and how is it different from nationalism? Well, nationalism tells me that my nation is unique, and that I have special obligations towards my nation. Fascism, in contrast, tells me that my nation is supreme, and that I have exclusive obligations towards it. I don’t need to care about anybody or anything other than my nation. Usually, of course, people have many identities and loyalties to different groups. For example, I can be a good patriot, loyal to my country, and at the same time, be loyal to my family, my neighborhood, my profession, humankind as a whole, truth and beauty. Of course, when I have different identities and loyalties, it sometimes creates conflicts and complications. But, well, who ever told you that life was easy? Life is complicated. Deal with it. 03:14Fascism is what happens when people try to ignore the complications and to make life too easy for themselves. Fascism denies all identities except the national identity and insists that I have obligations only towards my nation. If my nation demands that I sacrifice my family, then I will sacrifice my family. If the nation demands that I kill millions of people, then I will kill millions of people. And if my nation demands that I betray truth and beauty, then I should betray truth and beauty. For example, how does a fascist evaluate art? How does a fascist decide whether a movie is a good movie or a bad movie? Well, it’s very, very, very simple. 
There is really just one yardstick: if the movie serves the interests of the nation, it’s a good movie; if the movie doesn’t serve the interests of the nation, it’s a bad movie. That’s it. Similarly, how does a fascist decide what to teach kids in school? Again, it’s very simple. There is just one yardstick: you teach the kids whatever serves the interests of the nation. The truth doesn’t matter at all. 04:48Now, the horrors of the Second World War and of the Holocaust remind us of the terrible consequences of this way of thinking. But usually, when we talk about the ills of fascism, we do so in an ineffective way, because we tend to depict fascism as a hideous monster, without really explaining what was so seductive about it. It’s a bit like these Hollywood movies that depict the bad guys – Voldemort or Sauron or Darth Vader – as ugly and mean and cruel. They’re cruel even to their own supporters. When I see these movies, I never understand – why would anybody be tempted to follow a disgusting creep like Voldemort? The problem with evil is that in real life, evil doesn’t necessarily look ugly. It can look very beautiful. This is something that Christianity knew very well, which is why in Christian art, as [opposed to] Hollywood, Satan is usually depicted as a gorgeous hunk. This is why it’s so difficult to resist the temptations of Satan, and why it is also difficult to resist the temptations of fascism. 06:10Fascism makes people see themselves as belonging to the most beautiful and most important thing in the world – the nation. And then people think, “Well, they taught us that fascism is ugly. But when I look in the mirror, I see something very beautiful, so I can’t be a fascist, right?” Wrong. That’s the problem with fascism. When you look in the fascist mirror, you see yourself as far more beautiful than you really are. In the 1930s, when Germans looked in the fascist mirror, they saw Germany as the most beautiful thing in the world. 
If today, Russians look in the fascist mirror, they will see Russia as the most beautiful thing in the world. And if Israelis look in the fascist mirror, they will see Israel as the most beautiful thing in the world. This does not mean that we are now facing a rerun of the 1930s. 07:12Fascism and dictatorships might come back, but they will come back in a new form, a form which is much more relevant to the new technological realities of the 21st century. In ancient times, land was the most important asset in the world. Politics, therefore, was the struggle to control land. And dictatorship meant that all the land was owned by a single ruler or by a small oligarch. And in the modern age, machines became more important than land. Politics became the struggle to control the machines. And dictatorship meant that too many of the machines became concentrated in the hands of the government or of a small elite. Now data is replacing both land and machines as the most important asset. Politics becomes the struggle to control the flows of data. And dictatorship now means that too much data is being concentrated in the hands of the government or of a small elite. 08:28The greatest danger that now faces liberal democracy is that the revolution in information technology will make dictatorships more efficient than democracies. 08:42In the 20th century, democracy and capitalism defeated fascism and communism because democracy was better at processing data and making decisions. Given 20th-century technology, it was simply inefficient to try and concentrate too much data and too much power in one place. 09:07But it is not a law of nature that centralized data processing is always less efficient than distributed data processing. 
With the rise of artificial intelligence and machine learning, it might become feasible to process enormous amounts of information very efficiently in one place, to take all the decisions in one place, and then centralized data processing will be more efficient than distributed data processing. And then the main handicap of authoritarian regimes in the 20th century – their attempt to concentrate all the information in one place – it will become their greatest advantage. 09:58Another technological danger that threatens the future of democracy is the merger of information technology with biotechnology, which might result in the creation of algorithms that know me better than I know myself. And once you have such algorithms, an external system, like the government, cannot just predict my decisions, it can also manipulate my feelings, my emotions. A dictator may not be able to provide me with good health care, but he will be able to make me love him and to make me hate the opposition. Democracy will find it difficult to survive such a development because, in the end, democracy is not based on human rationality; it’s based on human feelings. During elections and referendums, you’re not being asked, “What do you think?” You’re actually being asked, “How do you feel?” And if somebody can manipulate your emotions effectively, democracy will become an emotional puppet show. 11:18So what can we do to prevent the return of fascism and the rise of new dictatorships? The number one question that we face is: Who controls the data? If you are an engineer, then find ways to prevent too much data from being concentrated in too few hands. And find ways to make sure the distributed data processing is at least as efficient as centralized data processing. This will be the best safeguard for democracy. As for the rest of us who are not engineers, the number one question facing us is how not to allow ourselves to be manipulated by those who control the data. 
12:11The enemies of liberal democracy, they have a method. They hack our feelings. Not our emails, not our bank accounts – they hack our feelings of fear and hate and vanity, and then use these feelings to polarize and destroy democracy from within. This is actually a method that Silicon Valley pioneered in order to sell us products. But now, the enemies of democracy are using this very method to sell us fear and hate and vanity. They cannot create these feelings out of nothing. So they get to know our own preexisting weaknesses. And then use them against us. And it is therefore the responsibility of all of us to get to know our weaknesses and make sure that they do not become a weapon in the hands of the enemies of democracy. 13:15Getting to know our own weaknesses will also help us to avoid the trap of the fascist mirror. As we explained earlier, fascism exploits our vanity. It makes us see ourselves as far more beautiful than we really are. This is the seduction. But if you really know yourself, you will not fall for this kind of flattery. If somebody puts a mirror in front of your eyes that hides all your ugly bits and makes you see yourself as far more beautiful and far more important than you really are, just break that mirror. 14:01Thank you. 14:02(Applause) 14:10Chris Anderson: Yuval, thank you. Goodness me. It’s so nice to see you again. So, if I understand you right, you’re alerting us to two big dangers here. One is the possible resurgence of a seductive form of fascism, but close to that, dictatorships that may not exactly be fascistic, but control all the data. I wonder if there’s a third concern that some people here have already expressed, which is where, not governments, but big corporations control all our data. What do you call that, and how worried should we be about that? 
14:44Yuval Noah Harari: Well, in the end, there isn’t such a big difference between the corporations and the governments, because, as I said, the questions is: Who controls the data? This is the real government. If you call it a corporation or a government – if it’s a corporation and it really controls the data, this is our real government. So the difference is more apparent than real. 15:06CA: But somehow, at least with corporations, you can imagine market mechanisms where they can be taken down. I mean, if consumers just decide that the company is no longer operating in their interest, it does open the door to another market. It seems easier to imagine that than, say, citizens rising up and taking down a government that is in control of everything. 15:25YNH: Well, we are not there yet, but again, if a corporation really knows you better than you know yourself – at least that it can manipulate your own deepest emotions and desires, and you won’t even realize – you will think this is your authentic self. So in theory, yes, in theory, you can rise against a corporation, just as, in theory, you can rise against a dictatorship. But in practice, it is extremely difficult. 15:55CA: So in “Homo Deus,” you argue that this would be the century when humans kind of became gods, either through development of artificial intelligence or through genetic engineering. Has this prospect of political system shift, collapse impacted your view on that possibility? 16:17YNH: Well, I think it makes it even more likely, and more likely that it will happen faster, because in times of crisis, people are willing to take risks that they wouldn’t otherwise take. And people are willing to try all kinds of high-risk, high-gain technologies. So these kinds of crises might serve the same function as the two world wars in the 20th century. The two world wars greatly accelerated the development of new and dangerous technologies. And the same thing might happen in the 21st century. 
I mean, you need to be a little crazy to run too fast, let’s say, with genetic engineering. But now you have more and more crazy people in charge of different countries in the world, so the chances are getting higher, not lower. 17:11CA: So, putting it all together, Yuval, you’ve got this unique vision. Roll the clock forward 30 years. What’s your guess – does humanity just somehow scrape through, look back and say, “Wow, that was a close thing. We did it!” Or not? 17:24YNH: So far, we’ve managed to overcome all the previous crises. And especially if you look at liberal democracy and you think things are bad now, just remember how much worse things looked in 1938 or in 1968. So this is really nothing, this is just a small crisis. But you can never know, because, as a historian, I know that you should never underestimate human stupidity. 17:53(Laughter) (Applause) 17:54It is one of the most powerful forces that shape history. 17:59CA: Yuval, it’s been an absolute delight to have you with us. Thank you for making the virtual trip. Have a great evening there in Tel Aviv. Yuval Harari! 18:07YNH: Thank you very much. 18:08(Applause) |
kable(views_add[1:10,], caption = "The example of original add_details_1 table") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| page_title | views_details |
|---|---|
| How does artificial intelligence learn? | 513,440 views | Briana Brownell • TED-Ed |
| The danger of AI is weirder than you think | 3,170,321 views | Janelle Shane • TED2019 |
| The wonderful and terrifying implications of computers that can learn | 2,693,800 views | Jeremy Howard • TEDxBrussels |
| How do we find dignity at work? | 2,204,611 views | Roy Bahat and Bryn Freedman • TED Salon: Zebra Technologies |
| The incredible inventions of intuitive AI | 7,349,317 views | Maurice Conti • TEDxPortland |
| How AI can bring on a second Industrial Revolution | 1,864,143 views | Kevin Kelly • TEDSummit |
| How AI can enhance our memory, work and social lives | 2,166,189 views | Tom Gruber • TED2017 |
| We’re building a dystopia just to make people click on ads | 3,353,192 views | Zeynep Tufekci • TEDGlobal>NYC |
| How AI can save our humanity | 4,115,567 views | Kai-Fu Lee • TED2018 |
| Why fascism is so tempting — and how your data could power it | 4,641,498 views | Yuval Noah Harari • TED2018 |
We then removed any duplicated observations and combined the two tables by matching the “title” column. The resulting table, which we named TED, contained 324 observations and 12 variables. However, for the purpose of our analysis, we were only interested in six specific variables: title of videos (title), posting time (posted), topic of videos (cate), the number of likes for videos (likes), transcript (transcript), and the number of views of videos (views_details). Therefore, we selected these variables and removed the rest, creating a new table called TED_sentiment that would serve as the main table for our sentiment analysis. We also removed the title variable from the TED table, as it would not be used in the sentiment analysis.
After performing these operations, we discovered that the TED table contained 34 missing values (NAs), which we subsequently removed. The resulting TED table contained 286 observations, representing 103 videos on AI, 86 videos on Climate change, and 97 videos on Relationships. This table would serve as the basis for our further analyses.
# Delete duplicate rows of TED
#sum(duplicated(TED)) #6
TED <- TED[!duplicated(TED), ]
# Delete duplicate rows of views_add
#unique(views_add$page_title) #304 so duplicate title = 6
views_add <- views_add[!duplicated(views_add), ]
# Combine tables
TED <- left_join(TED,views_add, by = c("title"="page_title"))
TED <- TED %>% as_tibble() %>%
select(title, views_times.x, cate, likes, tanscript, views_details)
# note: "tanscript" (sic) is the column name as scraped
colnames(TED)[2] <- "posted"
# Identify NA and remove
# checkna <- TED[is.na(TED$tanscript), ]
# numtotal <- data.frame(table(TED$cate))
# numna <- data.frame(table(checkna$cate))
# diff <- numtotal %>% left_join(numna, by = "Var1") %>%
# mutate(diff = Freq.x-Freq.y)
TED <- na.omit(TED) #286
catenum <- data.frame(table(TED$cate))
colnames(catenum) <- c("Topics", "Count")
kable(catenum[,], caption = "The number of videos per topics") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| Topics | Count |
|---|---|
| AI | 103 |
| Climate change | 86 |
| Relationships | 97 |
Next came the data parsing step. We converted the posting time, the number of likes, and the number of views into formats appropriate for further analyses. For example, the posting time of the first video, “How does artificial intelligence learn?”, was Mar 2021 in the original TED table; it was converted to 2021-03-01.
Every transcript began with the number of translated languages and the list of available translations. Since this project focuses only on the actual transcript, we stripped this header out. For example, the transcript of the first video, “How does artificial intelligence learn?”, began with “Transcript (28 Languages)Bahasa IndonesiaDeutschEnglishEspañolFrançaisItalianoMagyarPolskiPortuguês brasileiroPortuguês de PortugalRomânăTiếng ViệtTürkçeΕλληνικάРусскийСрпски, Srpskiעבריתالعربيةفارسىکوردی سۆرانیবাংলাதமிழ்ภาษาไทยမြန်မာဘာသာ中文 (简体)中文 (繁體)日本語한국어”.
# Parse data
TED$posted <- my(TED$posted)
TED$cate[which(TED$cate=="AI")] <- "1"
TED$cate[which(TED$cate=="Climate change")] <- "2"
TED$cate[which(TED$cate=="Relationships")] <- "3"
# Clean number of likes: drop the parentheses and the leading glyph scraped
# with the count, then expand K/M suffixes into scientific notation
# (e.g. "80K" -> "80e3" -> 80000)
TED$likes <- gsub("[()]",'',TED$likes)
x <- substr(TED$likes[1],1,1) # the non-numeric glyph preceding every count
TED$likes <- gsub(x,'',TED$likes)
TED$likes <- gsub("K", "e3", TED$likes)
TED$likes <- gsub("M", "e6", TED$likes)
TED$likes <- as.numeric(TED$likes)
# Clean number of views
# first separate the views detail into two parts (before "views" after "views")
views_time <- sapply(strsplit(TED$views_details, "views"), `[`, 1)
views_time <- gsub(" ","",views_time)
views_time <- gsub(",","",views_time)
TED$views_details <- as.numeric(views_time)
# Clean transcript
TED$tanscript <- gsub("^.+?00:(.*)","\\1",TED$tanscript)
TED$tanscript <- gsub("\r\n"," ",TED$tanscript)
TED$tanscript <- gsub("[[:digit:]]"," ",TED$tanscript)
# extract on version of TED for sentiment part
TED_sentiment <- TED
# no need for title column in the following analysis
TED <- TED %>% select(-title)
In order to facilitate data analysis, we assigned numerical values to the categories of AI, Climate Change, and Relationships in the TED dataset. The variable cate was used to represent these categories, with AI being represented by number 1, Climate Change by number 2, and Relationships by number 3. This allows for easier tracking of the videos in both supervised and unsupervised learning analyses.
Due to the limited number of available videos within the selected categories on the TED website, we were unable to gather a larger dataset for the unsupervised and supervised learning analyses. To increase the number of observations while avoiding overfitting and striving for a robust model, we treated each window of 20 sentences as one observation, based on the fact that the transcript of each video typically contains more than 20 sentences.
We split sentences using the tokenize_sentence() function from the quanteda package and created a new variable, sub_cate. For example, a sub_cate of 1.1 indicates that the observation comes from the first transcript in the AI category (the first topic). We then created a text variable to uniquely identify each text, with the format X.Y.Z indicating the Zth segment of 20 sentences in the Yth transcript of the Xth category. This approach increases the number of observations from 286 to 1,471; we named the resulting data frame TED_full. In summary, TED_full consists of 1,471 observations and 7 variables: posted, cate, like, view, subcate, text, and tanscript.
# Increase the number of instances: 20 sentences = 1 instance
TED_full <- TED[0,]
TED_full$subcate <- TED_full$cate #new col but same type
TED_full$text <- TED_full$cate
n_transcript <- length(TED$tanscript)
sub_cate_1 = 0
sub_cate_2 = 0
sub_cate_3 = 0
for (i in 1:n_transcript) {
if (TED$cate[i] == "1") {
sub_cate_1 <- sub_cate_1 + 1
subcat_temp = paste(TED$cate[i],".",as.character(sub_cate_1), sep = "", collapse = "")
}
else if (TED$cate[i] == "2") {
sub_cate_2 <- sub_cate_2 + 1
subcat_temp = paste(TED$cate[i],".",as.character(sub_cate_2), sep = "", collapse = "")
}
else {
sub_cate_3 <- sub_cate_3 + 1
subcat_temp = paste(TED$cate[i],".",as.character(sub_cate_3), sep = "", collapse = "")
}
transcript_i <- TED$tanscript[i]
transcript_i_sentence <- unlist(tokenize_sentence(transcript_i))
n_sen <- length(transcript_i_sentence)
n_group <- ceiling(n_sen/20)
for (j in 1:n_group) {
if (j == n_group) {
sentence_temp <- paste(transcript_i_sentence[((j-1)*20+1):(n_sen)], collapse = " ")
}
else {
sentence_temp <- paste(transcript_i_sentence[((j-1)*20+1):(j*20)], collapse = " ")
}
text_temp = paste(subcat_temp,".",as.character(j), sep = "", collapse = "")
TED_temp <- data.frame(posted = TED$posted[i], cate = TED$cate[i], like = TED$likes[i], view = TED$views_details[i], subcate = subcat_temp, text = text_temp, tanscript = sentence_temp)
TED_full <- rbind(TED_full, TED_temp)
}
}
TED_full$tanscript <- trimws(TED_full$tanscript)
# Our final table = TED_full consisting of 1471 instances
kable(TED_full[10,], caption = "The example of TED_full table") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| posted | cate | like | view | subcate | text | tanscript | |
|---|---|---|---|---|---|---|---|
| 10 | 2014-12-01 | 1 | 80000 | 2693800 | 1.3 | 1.3.4 | In fact, deep learning has done more than that. Complex, nuanced sentences like this one are now understandable with deep learning algorithms. As you can see here, this Stanford-based system showing the red dot at the top has figured out that this sentence is expressing negative sentiment. Deep learning now in fact is near human performance at understanding what sentences are about and what it is saying about those things. Also, deep learning has been used to read Chinese, again at about native Chinese speaker level. This algorithm developed out of Switzerland by people, none of whom speak or understand any Chinese. As I say, using deep learning is about the best system in the world for this, even compared to native human understanding. : This is a system that we put together at my company which shows putting all this stuff together. These are pictures which have no text attached, and as I’m typing in here sentences, in real time it’s understanding these pictures and figuring out what they’re about and finding pictures that are similar to the text that I’m writing. So you can see, it’s actually understanding my sentences and actually understanding these pictures. I know that you’ve seen something like this on Google, where you can type in things and it will show you pictures, but actually what it’s doing is it’s searching the webpage for the text. This is very different from actually understanding the images. This is something that computers have only been able to do for the first time in the last few months. : footnotefootnoteSo we can see now that computers can not only see but they can also read, and, of course, we’ve shown that they can understand what they hear. Perhaps not surprising now that I’m going to tell you they can write. Here is some text that I generated using a deep learning algorithm yesterday. And here is some text that an algorithm out of Stanford generated. 
Each of these sentences was generated by a deep learning algorithm to describe each of those pictures. This algorithm before has never seen a man in a black shirt playing a guitar. It’s seen a man before, it’s seen black before, it’s seen a guitar before, but it has independently generated this novel description of this picture. We’re still not quite at human performance here, but we’re close. In tests, humans prefer the computer-generated caption one out of four times. |
We tokenized the transcripts with the quanteda package to obtain a Document-Term Matrix and a TF-IDF matrix. The tokenization was performed twice: first on TED, which consists of 286 videos/observations, to uncover hidden insights within each video and to observe the similarities and dissimilarities between videos; second on TED_full, which consists of 1,471 instances, for the unsupervised and supervised learning analyses.
We applied the corpus() and tokens() functions to the tanscript variable to remove numbers, punctuation, symbols, and separators. We then removed English stop words from the SMART information retrieval system (571 words), plus two additional words, applaud and laughter, which appear frequently in our transcripts as sound representations. Sound representation is a TED transcript feature meant to give deaf and hard-of-hearing viewers access to the non-spoken auditory information. Afterward, we performed lemmatization and named the resulting object TED.tk1.
To obtain the Document-Term Matrix and the TFIDF matrix, we used dfm() and dfm_tfidf() functions, respectively. The first 10 terms and 10 documents (videos) are shown below.
# Quanteda
TED.cp1 <- corpus(TED$tanscript)
#summary(TED.cp1)
TED.tk1 <- tokens(
TED.cp1,
remove_numbers = TRUE,
remove_punct = TRUE,
remove_symbols = TRUE,
remove_separators = TRUE)
TED.tk1 <- TED.tk1 %>%
tokens_tolower() %>%
tokens_remove(c(stopwords(source = "smart"), "applaud", "laughter"))
TED.tk1 <- tokens_replace(
TED.tk1,
pattern = hash_lemmas$token,
replacement = hash_lemmas$lemma)
TED.dfm1 <- dfm(TED.tk1)
kable(TED.dfm1[1:10,1:10], caption = "The example of Document-Term Matrix") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| doc_id | today | artificial | intelligence | help | doctor | diagnose | patient | pilot | fly | commercial |
|---|---|---|---|---|---|---|---|---|---|---|
| text1 | 1 | 3 | 2 | 1 | 6 | 4 | 11 | 1 | 1 | 1 |
| text2 | 0 | 2 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| text3 | 1 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 1 |
| text4 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| text5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 |
| text6 | 3 | 12 | 10 | 0 | 2 | 1 | 0 | 1 | 2 | 0 |
| text7 | 3 | 3 | 7 | 4 | 1 | 1 | 0 | 0 | 0 | 0 |
| text8 | 1 | 8 | 10 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| text9 | 2 | 2 | 2 | 5 | 0 | 1 | 0 | 0 | 0 | 0 |
| text10 | 3 | 2 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
TED.tfidf1 <- dfm_tfidf(TED.dfm1)
kable(TED.tfidf1[1:10,1:10], caption = "The example of TFIDF matrix") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| doc_id | today | artificial | intelligence | help | doctor | diagnose | patient | pilot | fly | commercial |
|---|---|---|---|---|---|---|---|---|---|---|
| text1 | 0.2716746 | 1.8525508 | 1.1980671 | 0.4520447 | 5.0614931 | 4.378553 | 10.61505 | 1.225917 | 0.8129134 | 1.280275 |
| text2 | 0.0000000 | 1.2350339 | 1.1980671 | 0.0000000 | 0.0000000 | 0.000000 | 0.00000 | 0.000000 | 0.0000000 | 0.000000 |
| text3 | 0.2716746 | 0.0000000 | 0.0000000 | 0.0000000 | 1.6871644 | 0.000000 | 0.00000 | 0.000000 | 0.0000000 | 1.280275 |
| text4 | 0.0000000 | 0.6175169 | 0.5990335 | 0.0000000 | 0.0000000 | 0.000000 | 0.00000 | 0.000000 | 0.0000000 | 0.000000 |
| text5 | 0.2716746 | 0.0000000 | 0.0000000 | 0.0000000 | 0.0000000 | 0.000000 | 0.00000 | 0.000000 | 1.6258267 | 0.000000 |
| text6 | 0.8150238 | 7.4102033 | 5.9903354 | 0.0000000 | 1.6871644 | 1.094638 | 0.00000 | 1.225917 | 1.6258267 | 0.000000 |
| text7 | 0.8150238 | 1.8525508 | 4.1932348 | 1.8081786 | 0.8435822 | 1.094638 | 0.00000 | 0.000000 | 0.0000000 | 0.000000 |
| text8 | 0.2716746 | 4.9401355 | 5.9903354 | 0.0000000 | 0.0000000 | 0.000000 | 0.00000 | 0.000000 | 0.0000000 | 0.000000 |
| text9 | 0.5433492 | 1.2350339 | 1.1980671 | 2.2602233 | 0.0000000 | 1.094638 | 0.00000 | 0.000000 | 0.0000000 | 0.000000 |
| text10 | 0.8150238 | 1.2350339 | 1.1980671 | 0.0000000 | 0.0000000 | 0.000000 | 0.00000 | 0.000000 | 0.0000000 | 0.000000 |
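As a reference for how these values arise, here is a sketch of the weighting we assume dfm_tfidf() applies with its default settings (raw term counts and base-10 logarithms):

```latex
% tf(t,d): raw count of term t in document d
% df(t):   number of documents containing t;  N: total number of documents
\operatorname{tfidf}(t,d) \;=\; \operatorname{tf}(t,d) \times \log_{10}\frac{N}{\operatorname{df}(t)}
```

Under this scheme a term occurring in every one of the N = 286 documents gets weight 0, which is why very common words such as “today” carry only small TF-IDF values in the table above.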
Additionally, the term frequencies can be obtained with textstat_frequency(). The terms are ranked by frequency (rank = 1 for the most frequent) and plotted against rank, as shown below, to illustrate Zipf’s law. According to Zipf’s law, the frequency with which a word appears is inversely proportional to its rank, and the observed relationship between frequency and rank indeed follows this pattern.
We then present the scatter plot on a log10-log10 scale. Although we expect a linear relationship in this plot, we observe some deviation from Zipf’s law on the right-hand side of the chart (high ranks / low-frequency words). One reason for the deviation might be that the text we are analyzing is not a representative sample of the language: it contains many technical, topic-specific terms.
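Zipf’s law can be stated compactly; taking logarithms shows why the log10-log10 plot is expected to be approximately a straight line:

```latex
% Zipf's law: frequency is roughly inversely proportional to rank (s close to 1)
f(r) \;\propto\; \frac{1}{r^{s}}
% hence, on a log10-log10 scale, a line with slope -s and intercept c:
\log_{10} f(r) \;=\; c \;-\; s\,\log_{10} r
```

Deviations from linearity at high ranks therefore correspond to rare words being over- or under-represented relative to the power-law prediction.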
TED.freq1 <- textstat_frequency(TED.dfm1)
#head(TED.freq1, 10)
zipf_orig<- ggplot(TED.freq1,
aes(x = rank, y = frequency, label = feature)) +
geom_point() +
geom_text_repel() +
ggtitle("The relationship of frequency and rank")
zipf_log <- ggplot(TED.freq1,
aes(x = rank, y = frequency, label = feature)) +
geom_point() +
geom_text_repel() +
scale_x_log10() +
scale_y_log10() +
ggtitle("The relationship of frequency and rank on log10-log10 scale")
(zipf_orig+zipf_log)+
plot_layout(guides = "collect")
In this section, we repeated the same tokenization steps as in the previous section and stored the result as TED.tk. The first 10 terms and 10 documents of the Document-Term Matrix and the TF-IDF matrix are shown below.
# Quanteda
TED.cp <- corpus(TED_full$tanscript)
#summary(TED.cp)
TED.tk <- tokens(
TED.cp,
remove_numbers = TRUE,
remove_punct = TRUE,
remove_symbols = TRUE,
remove_separators = TRUE)
TED.tk <- TED.tk %>%
tokens_tolower() %>%
tokens_remove(c(stopwords(source = "smart"), "applaud", "laughter"))
TED.tk <- tokens_replace(
TED.tk,
pattern = hash_lemmas$token,
replacement = hash_lemmas$lemma)
TED.dfm <- dfm(TED.tk)
kable(TED.dfm[1:10,1:10], caption = "The example of Document-Term Matrix") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| doc_id | today | artificial | intelligence | help | doctor | diagnose | patient | pilot | fly | commercial |
|---|---|---|---|---|---|---|---|---|---|---|
| text1 | 1 | 2 | 2 | 1 | 6 | 4 | 8 | 1 | 1 | 1 |
| text2 | 0 | 1 | 0 | 0 | 0 | 0 | 3 | 0 | 0 | 0 |
| text3 | 0 | 2 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| text4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| text5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| text6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| text7 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
| text8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| text9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| text10 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
TED.tfidf <- dfm_tfidf(TED.dfm)
kable(TED.tfidf[1:10,1:10], caption = "The example of TFIDF matrix") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| doc_id | today | artificial | intelligence | help | doctor | diagnose | patient | pilot | fly | commercial |
|---|---|---|---|---|---|---|---|---|---|---|
| text1 | 0.7204546 | 2.169655 | 1.943426 | 1.057023 | 8.380564 | 6.944996 | 11.890971 | 1.91234 | 1.389461 | 1.991521 |
| text2 | 0.0000000 | 1.084827 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 4.459114 | 0.00000 | 0.000000 | 0.000000 |
| text3 | 0.0000000 | 2.169655 | 1.943426 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.00000 | 0.000000 | 0.000000 |
| text4 | 0.0000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.00000 | 0.000000 | 0.000000 |
| text5 | 0.0000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.00000 | 0.000000 | 0.000000 |
| text6 | 0.0000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.00000 | 0.000000 | 0.000000 |
| text7 | 0.7204546 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.00000 | 0.000000 | 1.991521 |
| text8 | 0.0000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.00000 | 0.000000 | 0.000000 |
| text9 | 0.0000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.00000 | 0.000000 | 0.000000 |
| text10 | 0.0000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.00000 | 0.000000 | 0.000000 |
To illustrate Zipf’s law again, we plotted the same two charts: frequency against rank, and the same relationship on a log10-log10 scale. The results are similar to the previous ones.
TED.freq <- textstat_frequency(TED.dfm)
#head(TED.freq, 10)
zipf_orig <- ggplot(TED.freq,
aes(x = rank, y = frequency, label = feature)) +
geom_point() +
geom_text_repel() +
ggtitle("The relationships of frequency and rank")
zipf_log <- ggplot(TED.freq,
aes(x = rank, y = frequency, label = feature)) +
geom_point() +
geom_text_repel() +
scale_x_log10() +
scale_y_log10() +
ggtitle("The relationships of frequency and rank on log10-log10 scale")
(zipf_orig+zipf_log)+
plot_layout(guides = "collect")
In this section, we perform initial investigations on TED (one transcript per video) to discover patterns and spot anomalies using summary statistics and graphical representations. We present an analysis of word frequency, a comparison of videos in terms of lexical diversity, a comparison of videos in terms of keyness, and the connections between terms computed via co-occurrence.
TED.freq1 %>%
top_n(20, frequency) %>%
ggplot(aes(
x = reorder(feature, frequency),
y = frequency)) +
geom_bar(stat = "identity") +
coord_flip() + #change x y axis
ylab("Frequency") +
xlab("term") +
ggtitle("The top 20 most frequent terms")
textplot_wordcloud(TED.dfm1)
From the top-20 most frequent terms chart and the word cloud, we can see that the top five terms, “people”, “make”, “thing”, “time”, and “year”, are common terms. Because the corpus covers three different topics, we do not expect topic-specific terms in the top ranks of the most frequent terms. However, we notice some terms related to our topics, such as “world”, “love”, “ai”, and “kind”.
We can associate each text with its most frequent terms. For example, text12 is characterized by “people”, “em”, and “ca”: “people” is expected since it is a common term, while “em” and “ca” look more specific to this text.
Subsequently, we investigate the highest TF-IDF terms to observe the terms specific to each document (video), presented below. For text12, “people” no longer appears, as expected: its TF-IDF is very low because it is not specific to any text. On the other hand, “em” and “ca” are specific to text12. A closer look at text12 reveals a dialogue between Elon Musk and Chris Anderson, so “em” and “ca” stand for Elon Musk and Chris Anderson, respectively.
TED.tfidf1 %>%
tidy() %>%
top_n(5, count) %>% #may change to top 10
ggplot(aes(x = term, y = count)) +
geom_col() +
coord_flip() +
theme(axis.text.y = element_text(size = 4),
axis.ticks.y = element_blank()) +
facet_wrap(~document, ncol = 2)
To get an overall view of the terms with at least one large TF-IDF, we compute, for each term, the maximum TF-IDF over all texts and present the top 20. In the charts below, we see that “em” and “regret” have the largest weighted frequencies, in the sense that their TF-IDF is large in at least one document: 206.91 and 203.06, respectively.
sort(apply(TED.tfidf1, 2, max), decreasing = TRUE)[1:10]
## em regret ag ca asshole sw gk bee
## 206.91226 203.06399 186.04901 128.87258 122.81830 115.44920 112.99284 107.40575
## dog lp
## 94.13888 81.06008
TED.tfidf1 %>%
tidy() %>%
group_by(term) %>%
summarize(count = max(count)) %>% #use summarize to find max
ungroup() %>%
arrange(desc(count)) %>%
top_n(20, count) %>%
ggplot(aes(x=reorder(term, count),
y = count)) +
geom_bar(stat = "identity") +
coord_flip() +
xlab("Max TF-IDF") +
ylab("term")
We perform a lexical diversity analysis using the Type-Token Ratio (TTR), computed with the textstat_lexdiv() function from the quanteda.textstats package.
As the chart and tables below show, lexical diversity varies considerably across these data. Some texts have a very high TTR (above roughly 0.8), indicating a rich vocabulary, and the TTRs then decrease gradually to the lowest value (approximately 0.33).
From the tables below, text237 and text131 have the richest vocabulary among the texts (videos), with TTRs of 0.816 and 0.802, respectively, while text88 has the lowest TTR (0.331), meaning it has the lowest vocabulary diversity among the texts (videos) in this corpus (sample).
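As a minimal illustration of how the TTR is computed (the number of unique word types divided by the total number of tokens), consider a toy token vector; the variable names here are illustrative only:

```r
# Toy Type-Token Ratio: unique types / total tokens
toks <- c("ai", "can", "learn", "and", "ai", "can", "teach")
ttr <- length(unique(toks)) / length(toks)
ttr # 5 types / 7 tokens = 0.7142857
```

Repeated words (“ai”, “can”) lower the ratio, which is why longer, more repetitive transcripts tend toward lower TTRs.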
ttr_top <- TED.dfm1 %>% textstat_lexdiv() %>% arrange(desc(TTR)) %>% top_n(10, TTR)
ttr_bottom <- TED.dfm1 %>% textstat_lexdiv() %>% arrange(desc(TTR)) %>% top_n(-10, TTR)
TED.dfm1 %>% textstat_lexdiv() %>%
ggplot(aes(reorder(document, -TTR),
TTR)) +
geom_bar(stat="identity") +
xlab("Text") +
ggtitle("Type-Token Ratio per text")
ttr_top
## document TTR
## 1 text237 0.8160000
## 2 text131 0.8024691
## 3 text103 0.7861111
## 4 text106 0.7833333
## 5 text199 0.7745902
## 6 text82 0.7632509
## 7 text124 0.7577640
## 8 text136 0.7575758
## 9 text128 0.7538462
## 10 text236 0.7337662
ttr_bottom
## document TTR
## 1 text240 0.3982642
## 2 text3 0.3881090
## 3 text44 0.3844857
## 4 text137 0.3824734
## 5 text47 0.3813559
## 6 text271 0.3705882
## 7 text125 0.3585291
## 8 text119 0.3420943
## 9 text12 0.3357307
## 10 text88 0.3306496
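The TTR underlying these figures is simply the number of unique word types divided by the total number of tokens. A minimal base-R sketch on an invented toy sentence (not from the corpus) illustrates what textstat_lexdiv() reports per document:

```r
# Type-Token Ratio: unique word types / total tokens
toy <- "the cat sat on the mat and the cat slept"
tokens <- strsplit(tolower(toy), "\\s+")[[1]]
ttr <- length(unique(tokens)) / length(tokens)
ttr
## [1] 0.7
```

Here the sentence has 10 tokens but only 7 distinct types, giving a TTR of 0.7.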
We start exploring the links between words by computing their co-occurrences across documents: the larger the value, the more often two words occur together in the same documents. For example, the value for “intelligence” and “artificial” is 1,208, which is quite high compared to the other values below. Hence, we can say that these two words often occur together.
TED.co <- fcm(TED.tk1,
context = "document",
tri = FALSE)
kable(TED.co[1:10,1:10], caption = "The example of co-occurrence matrix") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| term | today | artificial | intelligence | help | doctor | diagnose | patient | pilot | fly | commercial |
|---|---|---|---|---|---|---|---|---|---|---|
| today | 614 | 492 | 616 | 343 | 153 | 77 | 165 | 31 | 179 | 44 |
| artificial | 492 | 356 | 1208 | 175 | 112 | 72 | 131 | 25 | 82 | 19 |
| intelligence | 616 | 1208 | 1259 | 245 | 111 | 92 | 141 | 18 | 125 | 21 |
| help | 343 | 175 | 245 | 111 | 99 | 35 | 78 | 10 | 38 | 10 |
| doctor | 153 | 112 | 111 | 99 | 166 | 58 | 250 | 13 | 35 | 14 |
| diagnose | 77 | 72 | 92 | 35 | 58 | 16 | 96 | 7 | 16 | 6 |
| patient | 165 | 131 | 141 | 78 | 250 | 96 | 221 | 20 | 18 | 16 |
| pilot | 31 | 25 | 18 | 10 | 13 | 7 | 20 | 1 | 37 | 3 |
| fly | 179 | 82 | 125 | 38 | 35 | 16 | 18 | 37 | 116 | 5 |
| commercial | 44 | 19 | 21 | 10 | 14 | 6 | 16 | 3 | 5 | 0 |
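The construction of such a matrix can be seen on a tiny invented mini-corpus (two made-up documents, assuming the quanteda package used above): with `context = "document"`, fcm() counts, for each pair of terms, how often they appear together within the same document.

```r
library(quanteda)

# two toy documents sharing most of their vocabulary
toy <- tokens(c(d1 = "artificial intelligence helps doctors",
                d2 = "artificial intelligence helps pilots"))

# document-context co-occurrence matrix, full (non-triangular) form
fcm(toy, context = "document", tri = FALSE)
```

For instance, "artificial" and "intelligence" co-occur once in each of the two documents, so their entry is 2, while "doctors" and "pilots" never share a document and get 0.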
To read the co-occurrence matrix comfortably, we restrict the analysis to the terms that have a frequency larger than 500.
#create index = words that have frequency > 500
index <- TED.freq1 %>%
filter(frequency > 500) %>%
data.frame() %>%
select(feature)
# then subset the co-occurrence matrix to those terms
x <- TED.co[index$feature, index$feature]
kable(x[1:10,1:10], caption = "Example of the co-occurrence matrix restricted to terms with frequency larger than 500") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| term | people | make | thing | time | year | work | human | world | life | love |
|---|---|---|---|---|---|---|---|---|---|---|
| people | 17777 | 19621 | 24547 | 14684 | 15661 | 14919 | 11851 | 11351 | 8906 | 9195 |
| make | 19621 | 8745 | 18101 | 10778 | 11602 | 10450 | 10281 | 7895 | 5774 | 4632 |
| thing | 24547 | 18101 | 12592 | 12432 | 13259 | 11599 | 10604 | 10002 | 6001 | 5712 |
| time | 14684 | 10778 | 12432 | 4362 | 8549 | 7914 | 6787 | 6023 | 4653 | 4302 |
| year | 15661 | 11602 | 13259 | 8549 | 6205 | 7146 | 6763 | 7316 | 4110 | 4801 |
| work | 14919 | 10450 | 11599 | 7914 | 7146 | 5055 | 7033 | 5247 | 4067 | 3921 |
| human | 11851 | 10281 | 10604 | 6787 | 6763 | 7033 | 7916 | 5219 | 3128 | 2791 |
| world | 11351 | 7895 | 10002 | 6023 | 7316 | 5247 | 5219 | 3652 | 3259 | 3675 |
| life | 8906 | 5774 | 6001 | 4653 | 4110 | 4067 | 3128 | 3259 | 3239 | 3834 |
| love | 9195 | 4632 | 5712 | 4302 | 4801 | 3921 | 2791 | 3675 | 3834 | 7017 |
We then use the igraph library to create a network object and plot it. Although we restricted the analysis to terms with frequency larger than 500, the graph is still difficult to read. We therefore binarize the matrix: co-occurrence counts of 4,500 or less become 0 (no link), and counts above 4,500 become 1 (a link).
From the graph, we observe that make, thing, and people are central terms that co-occur heavily with the others, as they are not specific to any text. Again, because the corpus mixes three different topics, we do not expect topic-specific words to stand out in this analysis.
The term climate appears in the graph without any links to the other words: its frequency is larger than 500, but none of its co-occurrence counts exceed 4,500.
x[x <= 4500] <- 0
x[x > 4500] <- 1
network <- graph_from_adjacency_matrix(
x,
mode = "undirected",
diag = FALSE)
plot(network,
layout = layout_with_kk)
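The thresholding step can be illustrated on a small invented co-occurrence matrix (the terms and counts here are made up for illustration, not taken from the corpus):

```r
library(igraph)

# invented 3x3 symmetric co-occurrence matrix
co <- matrix(c(   0, 6000,  100,
               6000,    0,  200,
                100,  200,    0),
             nrow = 3,
             dimnames = list(c("people", "make", "climate"),
                             c("people", "make", "climate")))

co[co <= 4500] <- 0   # weak co-occurrence: no link
co[co >  4500] <- 1   # strong co-occurrence: link

g <- graph_from_adjacency_matrix(co, mode = "undirected", diag = FALSE)
# only people--make survives the threshold;
# "climate" ends up as an isolated vertex, as in the full graph
```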
In this section, we employ two dictionaries, AFINN and NRC, as well as a valence-shifter method, to conduct sentiment analysis on the full transcript of each video (that is, we do not split the transcripts into 20-sentence chunks). We are interested in whether a video's sentiment relates to its other features.
# recode cate levels to readable labels
TED_sentiment$cate <- gsub("1","AI",TED_sentiment$cate)
TED_sentiment$cate <- gsub("2","Climate change",TED_sentiment$cate)
TED_sentiment$cate <- gsub("3","Relationships",TED_sentiment$cate)
# sentiment analysis should not use the stemmed tokens, so tokenize the raw transcripts again
TED.tok <- unnest_tokens(
TED_sentiment,
output = "word",
input = "tanscript",
to_lower = TRUE,
strip_punct = TRUE,
strip_numeric = TRUE)
TED.tok <- TED.tok %>% filter(!word %in% c("laughter", "applaud")) # drop stage directions
First, we apply the NRC method, which maps each word to emotion categories, to determine the sentiment of each video's transcript, and we examine the relationship between these sentiments, the videos' topics, and the number of likes received.
# NRC
# join the corresponding sentiment qualifier in “nrc”
TED.sent.nrc <-
inner_join(
TED.tok,
get_sentiments("nrc"),
by = c("word" = "word"))
head(TED.sent.nrc, 5) %>% flextable() %>% autofit()
| title | posted | cate | likes | views_details | word | sentiment |
|---|---|---|---|---|---|---|
| How does artificial intelligence learn? | 2021-03-01 | AI | 15,000 | 513,440 | intelligence | fear |
| How does artificial intelligence learn? | 2021-03-01 | AI | 15,000 | 513,440 | intelligence | joy |
| How does artificial intelligence learn? | 2021-03-01 | AI | 15,000 | 513,440 | intelligence | positive |
| How does artificial intelligence learn? | 2021-03-01 | AI | 15,000 | 513,440 | intelligence | trust |
| How does artificial intelligence learn? | 2021-03-01 | AI | 15,000 | 513,440 | predict | anticipation |
Since there are close to 300 transcripts (videos), we extract the 20 videos with the most likes and the 20 with the fewest.
In this part, we apply the NRC method in two ways: first with raw sentiment counts, and then with the counts re-scaled by document length.
# Sub data for checking Video likes topic
TED.nrc <- TED.sent.nrc %>%
group_by(title,cate,likes,sentiment) %>% summarise(n=n())
# too many texts to read at once, so extract the 20 transcripts
# with the most and the fewest likes
# (20 videos x 10 NRC sentiments = 200 rows each)
toplike20 <- TED.nrc[order(TED.nrc$likes,decreasing = T),][1:200,]
taillike20 <- TED.nrc[order(TED.nrc$likes,decreasing = F),][1:200,]
# top
toplike20%>%
ggplot(mapping = aes(x = sentiment, y=n, fill = sentiment)) +
geom_bar(stat = "identity",
alpha = 0.8) +
facet_wrap(~ title) +
coord_flip()+
theme(legend.position = 'bottom')+
labs(y="Number of sentiment words")+
ggtitle("The sentiment of 20 videos with most likes")
# tail
taillike20%>%
ggplot(mapping = aes(x = sentiment, y=n, fill = sentiment)) +
geom_bar(stat = "identity",
alpha = 0.8) +
facet_wrap(~ title) +
coord_flip()+
theme(legend.position = 'bottom')+
labs(y="Number of sentiment words")+
ggtitle("The sentiment of 20 videos with least likes")
Re-scaling the sentiment counts by document length:
# the frequencies of sentiments are computed, by document
TED.sent.nrc.total <- TED.sent.nrc %>%
group_by(title,likes) %>%
summarize(Total = n()) %>%
ungroup()
#top
left_join(
TED.sent.nrc,
TED.sent.nrc.total)%>%
filter(title %in% toplike20$title) %>%
group_by(title, sentiment) %>%
summarize(n = n(),
Total = unique(Total)) %>%
ungroup() %>%
mutate(relfreq = n / Total) %>%
ggplot(aes(
x = sentiment,
y = relfreq,
fill = sentiment)) +
geom_bar(stat = "identity", alpha = 0.8) +
facet_wrap(~ title) +
coord_flip()+
theme(legend.position = 'bottom')+
labs(y="Relative frequency of sentiment")+
ggtitle("The sentiment of 20 videos with most likes")
#tail
left_join(
TED.sent.nrc,
TED.sent.nrc.total)%>%
filter(title %in% taillike20$title) %>%
group_by(title, sentiment) %>%
summarize(n = n(),
Total = unique(Total)) %>%
ungroup() %>%
mutate(relfreq = n / Total) %>%
ggplot(aes(
x = sentiment,
y = relfreq,
fill = sentiment)) +
geom_bar(stat = "identity", alpha = 0.8) +
facet_wrap(~ title) +
coord_flip()+
theme(legend.position = 'bottom')+
labs(y="Relative frequency of sentiment")+
ggtitle("The sentiment of 20 videos with least likes")
We did not observe any significant differences in the distribution of sentiments, such as positive and anticipation, between videos with high and low numbers of likes. Both positive and anticipation sentiments were present across all videos, and some videos in the top 20 also exhibited relatively high levels of negative and fear sentiments.
To examine the frequency of sentiments across topics, we compare which sentiments are most prevalent in each topic. For the climate change topic, we might expect a higher frequency of negative or fear-related sentiments, given the potentially catastrophic consequences of climate change; for the AI topic, we might expect more anticipation or positive sentiments, as AI promises many benefits and advancements.
# sentiment per video is hard to read, so aggregate per category
TED.nrc %>%
group_by(cate,sentiment) %>%
summarise(cate_n = sum(n)) %>%
ggplot(mapping = aes(subgroup = cate, fill = interaction(sentiment, cate), area = cate_n)) +
geom_treemap(color="white", size=0.5*.pt, alpha=NA) +
geom_treemap_subgroup_text(
place = "center", alpha = 0.5, grow = TRUE) +
geom_treemap_text(mapping = aes(
label = sentiment),
color = "white",
place = "center", grow = FALSE) +
guides(fill = FALSE)
As we assumed, the AI topic is often accompanied by positive and anticipation sentiments, and trust cannot be ignored; yet negative also accounts for a non-negligible share. Contrary to our speculation, positive is the most frequent sentiment in the climate change topic as well. The sentiments are distributed more evenly across the relationship videos, although positive is still the most frequent there.
Based on this analysis, we begin to suspect that positive is in fact the dominant sentiment across all TED talk videos.
Beyond this initial assumption, we check one more: whether positive sentiment appears in all videos, using the value-based AFINN lexicon.
# Afinn
TED.sent.afinn <-
inner_join(
TED.tok,
get_sentiments("afinn"),
by = c("word" = "word"))
TED.sent.afinn %>%
group_by(title,cate) %>%
summarize(Score = mean(value)) %>%
ungroup() %>%
ggplot(aes(x = reorder_within(title, Score,cate), y = Score, fill = cate)) +
geom_bar(stat = "identity") +
coord_flip() +
ylab("Mean Sentiment Score") +
xlab("")
Here, we calculate the average sentiment score per video. The number of transcripts with positive scores far exceeds the number with negative scores, so TED talks clearly lean positive. From the topic perspective, all three topics appear at every level of sentiment score.
#extract the top and tail videos
video_sentiscore <- TED.sent.afinn %>%
group_by(title) %>%
summarize(Score = mean(value))
TED.sent.afinn.like <- TED.sent.afinn %>%
group_by(title,cate,likes) %>%
summarize(Score = mean(value))
TED.sent.afinn.like %>% ggplot(aes(x=likes,y=Score,color = cate))+
geom_point(size=1) +
geom_smooth(method = "lm")+
facet_wrap(~cate,scales = 'free')+
theme(legend.position = 'bottom')
Since most videos receive a similar number of likes and only a few attract very many, we facet by category to inspect the joint distribution of likes and sentiment scores. No obvious pattern emerges here either. Compared to AI and Relationships, the sentiment scores of the climate change talks are more widely spread, and videos with a broad range of sentiment scores appear at every level of likes.
TED.sent.afinn.cate <- TED.sent.afinn %>%
group_by(title,cate) %>%
summarize(Score = mean(value))
TED.sent.afinn.cate %>%
ggplot(mapping = aes(x = cate, y = Score))+
geom_boxplot()+
labs(x="Topics")
The sentiment scores of the three topics are fairly similar, all sitting in the upper-middle, more positive, range. Within the climate change and AI topics, the scores are quite evenly spread, although the AI topic has two low outliers, the most negative videos.
TED.sent.afinn.year <- TED.sent.afinn
TED.sent.afinn.year <- TED.sent.afinn.year %>%
group_by(title,cate,posted) %>%
summarize(Score = mean(value))
TED.sent.afinn.year %>% ggplot(aes(x=posted,y=Score))+
geom_point()+
geom_smooth(method = "lm")+
theme(legend.position = 'bottom')+
labs(x="Posted Year")+
facet_wrap(~cate)+
scale_x_date(date_minor_breaks = "2 day")
The sentiment of the talks fluctuates strongly over the years. Overall, the trend is slightly decreasing, but there is no clear correlation between sentiment score and posting year.
In this section, we would like to check if the results would change after using valence-shifters.
## split by sentences
TED_sentiment_text <- get_sentences(TED_sentiment$tanscript)
## Compute the sentiment by sentences
TED.senti <- sentiment(TED_sentiment_text)
## Prepare a tibble for the plot
TED.senti <- as_tibble(TED.senti)
TED.sentdoc <- sentiment_by(TED_sentiment$tanscript)
TED.sentdoc %>%
mutate(Document = factor(paste("Doc_", element_id, sep = ""))) %>%
ggplot(aes(x = reorder(Document, ave_sentiment),
y = ave_sentiment)) +
geom_bar(stat="identity") +
coord_flip() +
xlab("") +
ylab("Average Sentiment Score")
# check the difference between the AFINN method and valence shifters
# count the number of documents with average score < 0
negative_VS <- sum(TED.sentdoc$ave_sentiment < 0)
negative_afinn <- TED.sent.afinn %>%
group_by(title) %>%
summarize(Score = mean(value)) %>%
filter(Score < 0)
negative_afinn_n <- nrow(negative_afinn)
# negative_VS: 19 documents; negative_afinn_n: 31 documents
First, the distribution of sentiment scores is similar to the one obtained without valence shifters. Counting the transcripts with negative average scores, we find 31 videos with negative scores before negation is taken into account, and 19 once valence shifters such as negation are considered.
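The effect of a valence shifter can be seen on an invented toy pair of sentences, a sketch assuming the sentimentr package behind sentiment_by(): a word-level lexicon like AFINN would score both sentences identically, since "not" carries no polarity of its own, whereas sentimentr treats it as a negator.

```r
library(sentimentr)

# the polarized word "happy" contributes positively...
sentiment_by("I am happy.")$ave_sentiment      # positive score

# ...but the negator "not" flips its contribution
sentiment_by("I am not happy.")$ave_sentiment  # negative score
```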
Topic modeling is a method for discovering the latent themes or topics that exist within a collection of documents. Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) are two popular techniques for topic modeling.
First, we build the LSA object with 4 dimensions. Latent Semantic Analysis (LSA) decomposes the DTM (TED.dfm) into three matrices (\(M = U\Sigma V^{T}\)), centred on 4 topics. We inspect the three matrices: U (document-topic similarities), \(\Sigma\) (topic strengths), and V (term-topic similarities).
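The decomposition can be sketched with base R's svd() on a small invented matrix (4 "documents" by 3 "terms"), keeping 2 dimensions the way textmodel_lsa() keeps the first nd:

```r
# invented toy document-term matrix
M <- matrix(c(2, 0, 1,
              1, 1, 0,
              0, 2, 3,
              0, 1, 2), nrow = 4, byrow = TRUE)

s <- svd(M, nu = 2, nv = 2)   # M = U %*% diag(d) %*% t(V)
docs  <- s$u                  # 4 x 2: document-topic similarities (like TED.lsa$docs)
terms <- s$v                  # 3 x 2: term-topic similarities
sigma <- s$d[1:2]             # topic strengths

# rank-2 approximation of the original matrix
M2 <- docs %*% diag(sigma) %*% t(terms)
```

Each row of `docs` plays the role of one row in the document-topic table below: the document's coordinates in the reduced topic space.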
The document-topic similarity table below shows the link between each text and each topic; for example, text1 is most strongly associated with dimension 2.
TED.lsa <- textmodel_lsa(x = TED.dfm,nd = 4)
kable(TED.lsa$docs,
col.names = c("dimension1","dimension2","dimension3","dimension4"),
caption = "Doc-topic sim.(LSA on TF)") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| | dimension1 | dimension2 | dimension3 | dimension4 |
|---|---|---|---|---|
| text1 | -0.0279833 | 0.0561772 | -0.0062328 | 0.0010815 |
| text2 | -0.0215096 | 0.0463497 | -0.0014294 | -0.0103448 |
| text3 | -0.0307021 | 0.0859480 | 0.0117433 | -0.0083037 |
| text4 | -0.0380724 | 0.1135957 | 0.0104629 | -0.0307621 |
| text5 | -0.0356207 | 0.0885008 | 0.0106824 | -0.0660644 |
| text6 | -0.0246441 | 0.0817765 | 0.0108066 | -0.0532954 |
| text7 | -0.0338506 | 0.0583223 | 0.0036653 | 0.0059932 |
| text8 | -0.0370382 | 0.0411531 | 0.0103976 | 0.0116836 |
| text9 | -0.0447897 | 0.0265595 | 0.0185700 | -0.0080117 |
| text10 | -0.0338763 | 0.0454285 | 0.0153195 | 0.0024713 |
| text11 | -0.0423486 | 0.0567818 | -0.0033229 | -0.0016736 |
| text12 | -0.0379895 | 0.0574517 | 0.0136700 | 0.0113588 |
| text13 | -0.0288484 | 0.0045382 | -0.0153662 | -0.0019589 |
| text14 | -0.0406856 | 0.0217827 | 0.0084525 | 0.0156910 |
| text15 | -0.0002734 | -0.0002192 | 0.0000961 | 0.0000464 |
| text16 | -0.0365886 | 0.0122418 | 0.0176984 | 0.0030070 |
| text17 | -0.0311208 | -0.0028264 | -0.0026220 | 0.0048595 |
| text18 | -0.0492976 | -0.0351071 | 0.0427234 | -0.0208907 |
| text19 | -0.0363120 | -0.0152421 | 0.0234045 | -0.0237751 |
| text20 | -0.0339046 | -0.0038624 | 0.0173571 | -0.0082206 |
| text21 | -0.0208069 | 0.0077222 | -0.0139165 | 0.0000852 |
| text22 | -0.0222323 | 0.0278802 | -0.0051485 | 0.0079200 |
| text23 | -0.0297813 | 0.0321456 | -0.0013286 | 0.0138956 |
| text24 | -0.0374560 | 0.0496412 | 0.0096167 | 0.0257162 |
| text25 | -0.0557391 | 0.0888191 | 0.0219965 | 0.0710129 |
| text26 | -0.0521989 | 0.0306777 | -0.0003609 | 0.0141619 |
| text27 | -0.0042544 | 0.0008642 | 0.0006344 | 0.0036206 |
| text28 | -0.0251029 | 0.0411651 | -0.0072890 | -0.0124747 |
| text29 | -0.0344050 | 0.0589457 | 0.0114062 | -0.0139354 |
| text30 | -0.0369580 | 0.0318083 | -0.0254685 | 0.0025975 |
| text31 | -0.0226044 | 0.0485518 | 0.0056026 | 0.0161761 |
| text32 | -0.0356187 | 0.0534748 | 0.0104151 | -0.0168302 |
| text33 | -0.0080838 | 0.0072980 | -0.0092405 | -0.0085102 |
| text34 | -0.0352532 | 0.0656681 | 0.0181894 | -0.0454033 |
| text35 | -0.0242353 | 0.0542360 | -0.0041722 | -0.0215638 |
| text36 | -0.0330215 | 0.0461389 | 0.0104417 | -0.0272344 |
| text37 | -0.0308500 | 0.0598555 | -0.0063604 | -0.0430018 |
| text38 | -0.0179821 | 0.0097366 | 0.0049577 | 0.0050480 |
| text39 | -0.0238330 | 0.0061130 | 0.0053042 | 0.0052143 |
| text40 | -0.0274934 | 0.0193347 | 0.0062537 | -0.0169289 |
| text41 | -0.0315560 | 0.0155617 | 0.0168134 | -0.0053682 |
| text42 | -0.0234046 | -0.0124838 | 0.0248677 | -0.0152512 |
| text43 | -0.0179985 | -0.0099123 | 0.0128111 | 0.0014105 |
| text44 | -0.0261786 | -0.0016035 | -0.0003247 | -0.0083328 |
| text45 | -0.0287634 | 0.0188651 | -0.0088042 | -0.0123455 |
| text46 | -0.0029347 | 0.0044119 | -0.0022362 | -0.0012734 |
| text47 | -0.0552940 | 0.0733096 | 0.0024355 | -0.0509485 |
| text48 | -0.0358880 | 0.0440969 | -0.0276464 | -0.0150295 |
| text49 | -0.0521970 | 0.0006805 | 0.0569793 | -0.0663707 |
| text50 | -0.0317971 | 0.0670473 | 0.0153916 | -0.0577716 |
| text51 | -0.0206433 | -0.0126664 | 0.0078986 | -0.0133199 |
| text52 | -0.0205440 | -0.0224562 | 0.0134115 | -0.0112309 |
| text53 | -0.0246325 | -0.0093468 | 0.0051426 | 0.0010782 |
| text54 | -0.0372347 | 0.0492480 | -0.0246981 | -0.0280275 |
| text55 | -0.0179237 | 0.0050522 | -0.0064212 | -0.0046819 |
| text56 | -0.0340575 | -0.0007286 | 0.0000155 | 0.0004383 |
| text57 | -0.0002800 | -0.0002177 | 0.0000933 | 0.0000443 |
| text58 | -0.0387312 | 0.0139075 | 0.0124944 | -0.0249154 |
| text59 | -0.0187707 | -0.0058363 | 0.0048127 | -0.0145163 |
| text60 | -0.0304345 | -0.0145236 | -0.0096096 | 0.0052979 |
| text61 | -0.0274175 | -0.0111679 | -0.0359774 | 0.0146917 |
| text62 | -0.0336390 | -0.0307688 | -0.0397434 | 0.0019181 |
| text63 | -0.0365880 | -0.0182064 | -0.0274250 | 0.0199141 |
| text64 | -0.0367709 | 0.0240360 | 0.0088370 | -0.0023384 |
| text65 | -0.0369276 | 0.0098701 | -0.0100740 | 0.0136045 |
| text66 | -0.0380612 | -0.0003095 | -0.0075114 | -0.0003216 |
| text67 | -0.0438540 | 0.0388025 | 0.0346646 | 0.1135046 |
| text68 | -0.0436554 | -0.0029038 | 0.0384375 | 0.0593777 |
| text69 | -0.0432863 | 0.0089356 | -0.0104822 | 0.0353299 |
| text70 | -0.0230926 | 0.0254432 | 0.0042603 | 0.0043933 |
| text71 | -0.0363061 | 0.0065540 | 0.0182818 | -0.0013600 |
| text72 | -0.0253895 | 0.0186895 | 0.0178868 | -0.0008391 |
| text73 | -0.0222579 | -0.0043708 | 0.0067340 | 0.0023973 |
| text74 | -0.0225799 | -0.0086746 | -0.0271082 | 0.0152771 |
| text75 | -0.0365370 | -0.0004380 | -0.0128821 | 0.0025697 |
| text76 | -0.0390514 | -0.0201863 | 0.0121366 | -0.0127377 |
| text77 | -0.0483404 | -0.0368092 | 0.0285262 | -0.0121317 |
| text78 | -0.0220029 | -0.0159129 | -0.0014702 | 0.0035078 |
| text79 | -0.0380605 | -0.0097169 | -0.0031074 | 0.0115252 |
| text80 | -0.0370130 | -0.0109523 | -0.0137154 | 0.0037824 |
| text81 | -0.0307334 | -0.0165525 | -0.0143823 | -0.0040249 |
| text82 | -0.0296924 | -0.0104191 | 0.0110662 | -0.0204233 |
| text83 | -0.0123499 | -0.0042439 | -0.0082059 | -0.0008252 |
| text84 | -0.0273602 | 0.0017589 | 0.0222625 | 0.0105088 |
| text85 | -0.0333002 | -0.0073377 | 0.0280700 | -0.0012174 |
| text86 | -0.0405971 | 0.0108488 | -0.0011507 | -0.0049024 |
| text87 | -0.0261048 | 0.0297196 | -0.0006283 | -0.0087251 |
| text88 | -0.0017894 | -0.0008902 | 0.0018614 | 0.0020646 |
| text89 | -0.0238139 | 0.0268947 | 0.0072796 | -0.0062685 |
| text90 | -0.0213756 | -0.0179840 | 0.0233519 | -0.0104660 |
| text91 | -0.0276659 | 0.0208409 | 0.0040739 | 0.0038001 |
| text92 | -0.0223419 | -0.0000700 | 0.0036568 | 0.0113741 |
| text93 | -0.0079573 | 0.0009448 | 0.0024572 | 0.0020406 |
| text94 | -0.0136609 | -0.0002756 | 0.0067663 | 0.0062770 |
| text95 | -0.0188602 | -0.0134104 | 0.0149755 | 0.0026671 |
| text96 | -0.0158844 | 0.0192038 | 0.0161773 | -0.0091170 |
| text97 | -0.0303079 | 0.0653626 | 0.0169528 | -0.0161569 |
| text98 | -0.0306567 | 0.0496087 | 0.0213858 | -0.0139761 |
| text99 | -0.0349690 | 0.0859240 | 0.0122481 | -0.0123064 |
| text100 | -0.0101329 | 0.0205964 | 0.0005916 | 0.0030593 |
| text101 | -0.0252831 | 0.0151071 | 0.0049409 | 0.0474391 |
| text102 | -0.0288951 | 0.0422386 | 0.0167159 | 0.0648452 |
| text103 | -0.0215067 | 0.0316238 | 0.0151059 | 0.0815688 |
| text104 | -0.0334646 | 0.0232907 | -0.0029621 | 0.0044123 |
| text105 | -0.0552199 | 0.1785669 | -0.0013696 | -0.1143071 |
| text106 | -0.0334138 | 0.0186723 | -0.0051097 | -0.0132088 |
| text107 | -0.0456812 | 0.0938689 | 0.0025930 | -0.0819867 |
| text108 | -0.0404497 | 0.0628996 | 0.0006520 | -0.0458758 |
| text109 | -0.0032467 | -0.0001189 | 0.0007714 | -0.0007521 |
| text110 | -0.0451234 | -0.0521649 | 0.0408087 | -0.0342268 |
| text111 | -0.0429856 | 0.0069827 | -0.0050359 | 0.0193886 |
| text112 | -0.0284270 | 0.0068459 | 0.0068902 | 0.0108043 |
| text113 | -0.0250141 | 0.0191274 | -0.0083502 | -0.0039554 |
| text114 | -0.0324672 | -0.0039064 | 0.0106203 | 0.0090367 |
| text115 | -0.0229685 | 0.0157744 | -0.0107678 | -0.0027180 |
| text116 | -0.0124274 | 0.0093274 | -0.0024347 | -0.0083142 |
| text117 | -0.0112999 | -0.0077356 | 0.0044358 | 0.0024031 |
| text118 | -0.0299285 | -0.0020076 | 0.0090506 | -0.0032034 |
| text119 | -0.0228151 | -0.0041759 | 0.0058362 | 0.0050531 |
| text120 | -0.0152173 | -0.0029941 | 0.0077489 | 0.0028596 |
| text121 | -0.0504794 | 0.0488037 | 0.0010442 | -0.0118251 |
| text122 | -0.0326946 | 0.0046746 | 0.0202754 | -0.0318908 |
| text123 | -0.0237463 | 0.0374208 | -0.0036084 | -0.0113747 |
| text124 | -0.0447069 | 0.0468236 | -0.0142640 | -0.0138476 |
| text125 | -0.0327403 | 0.0260297 | -0.0150693 | 0.0005384 |
| text126 | -0.0080820 | 0.0142708 | 0.0028892 | -0.0054574 |
| text127 | -0.0262351 | 0.0126524 | -0.0179349 | 0.0124041 |
| text128 | -0.0080480 | 0.0012154 | -0.0069604 | 0.0074640 |
| text129 | -0.0399340 | 0.0284301 | -0.0148747 | 0.0304464 |
| text130 | -0.0225688 | 0.0309246 | -0.0064541 | 0.0038491 |
| text131 | -0.0278286 | 0.0400140 | -0.0096108 | 0.0108925 |
| text132 | -0.0335176 | -0.0117386 | 0.0068298 | 0.0017164 |
| text133 | -0.0421681 | 0.0251075 | 0.0050350 | 0.0136960 |
| text134 | -0.0469385 | 0.0399387 | 0.0041594 | -0.0037036 |
| text135 | -0.0298632 | -0.0029811 | -0.0047221 | 0.0059372 |
| text136 | -0.0397649 | 0.0003641 | -0.0033884 | 0.0013429 |
| text137 | -0.0293101 | 0.0124232 | -0.0148709 | 0.0321575 |
| text138 | -0.0058502 | 0.0036290 | -0.0038724 | 0.0008068 |
| text139 | -0.0240327 | 0.0343566 | -0.0024762 | -0.0105157 |
| text140 | -0.0293267 | 0.0235632 | 0.0008046 | -0.0108405 |
| text141 | -0.0105679 | 0.0128090 | -0.0058995 | -0.0066040 |
| text142 | -0.0182028 | 0.0024306 | -0.0441123 | -0.0028671 |
| text143 | -0.0232917 | -0.0057918 | -0.0224714 | 0.0065591 |
| text144 | -0.0281074 | -0.0046934 | -0.0057654 | 0.0096708 |
| text145 | -0.0180302 | 0.0052750 | 0.0059629 | 0.0070135 |
| text146 | -0.0167414 | 0.0068004 | -0.0110631 | 0.0055971 |
| text147 | -0.0040667 | 0.0011578 | -0.0025673 | 0.0005869 |
| text148 | -0.0218604 | 0.0097503 | -0.0027959 | 0.0007421 |
| text149 | -0.0238928 | 0.0398287 | -0.0085369 | 0.0107892 |
| text150 | -0.0191394 | 0.0152123 | -0.0059450 | -0.0018734 |
| text151 | -0.0377778 | 0.0385476 | -0.0055327 | -0.0113984 |
| text152 | -0.0272636 | 0.0058123 | -0.0095221 | 0.0023979 |
| text153 | -0.0259141 | 0.0376867 | -0.0076300 | -0.0142960 |
| text154 | -0.0197460 | -0.0066844 | -0.0222594 | 0.0077663 |
| text155 | -0.0246546 | 0.0096815 | 0.0242707 | 0.0809769 |
| text156 | -0.0363223 | 0.0167559 | 0.0584669 | 0.1488315 |
| text157 | -0.0359745 | 0.0281995 | 0.0355989 | 0.1290572 |
| text158 | -0.0003100 | -0.0003315 | 0.0004466 | 0.0001506 |
| text159 | -0.0141835 | -0.0052222 | -0.0138384 | -0.0021810 |
| text160 | -0.0129338 | -0.0017204 | -0.0117762 | 0.0070835 |
| text161 | -0.0171682 | 0.0196943 | 0.0000753 | -0.0075010 |
| text162 | -0.0179105 | 0.0431996 | 0.0077946 | -0.0208165 |
| text163 | -0.0199926 | 0.0604170 | -0.0009650 | -0.0398460 |
| text164 | -0.0164277 | 0.0113398 | 0.0053861 | -0.0022887 |
| text165 | -0.0151047 | 0.0069685 | -0.0061232 | -0.0013786 |
| text166 | -0.0147898 | 0.0029638 | -0.0001522 | 0.0042710 |
| text167 | -0.0154958 | -0.0227844 | 0.0245828 | -0.0112759 |
| text168 | -0.0019561 | -0.0059643 | 0.0083503 | -0.0046207 |
| text169 | -0.0540385 | 0.0964672 | -0.0238750 | -0.0919114 |
| text170 | -0.0357382 | 0.1033155 | -0.0203808 | -0.0890914 |
| text171 | -0.0506566 | 0.1059461 | -0.0077697 | -0.0852113 |
| text172 | -0.0199858 | 0.0538812 | -0.0075835 | -0.0379010 |
| text173 | -0.0292039 | 0.0311755 | 0.0183368 | 0.1184753 |
| text174 | -0.0426940 | 0.0165574 | 0.0248219 | 0.1161570 |
| text175 | -0.0282166 | 0.0041431 | 0.0306893 | 0.0243437 |
| text176 | -0.0239496 | 0.0366245 | 0.0244173 | 0.1206897 |
| text177 | -0.0261113 | 0.0548723 | 0.0205078 | 0.1006257 |
| text178 | -0.0118952 | 0.0198276 | 0.0068513 | 0.0213746 |
| text179 | -0.0730337 | 0.0754465 | 0.0305581 | 0.0106583 |
| text180 | -0.0394864 | 0.0169127 | 0.0065107 | 0.0162286 |
| text181 | -0.0211651 | -0.0106377 | -0.0053532 | 0.0016129 |
| text182 | -0.0216395 | 0.0253066 | 0.0066810 | -0.0088423 |
| text183 | -0.0187809 | 0.0090976 | 0.0069218 | -0.0014912 |
| text184 | -0.0287576 | 0.0450124 | 0.0095551 | 0.0132355 |
| text185 | -0.0288333 | 0.0407495 | 0.0063370 | -0.0200690 |
| text186 | -0.0254033 | 0.0305906 | 0.0105319 | 0.0149104 |
| text187 | -0.0028981 | -0.0001258 | 0.0002319 | 0.0024810 |
| text188 | -0.0387584 | 0.0595247 | 0.0147159 | -0.0206766 |
| text189 | -0.0248883 | 0.0087208 | 0.0090503 | -0.0120219 |
| text190 | -0.0308973 | 0.0091735 | 0.0142090 | -0.0072949 |
| text191 | -0.0036543 | 0.0006069 | -0.0000831 | -0.0009432 |
| text192 | -0.0371389 | 0.0404968 | -0.0057810 | -0.0059049 |
| text193 | -0.0404662 | 0.0459959 | 0.0037875 | -0.0064073 |
| text194 | -0.0315524 | 0.0379695 | 0.0009668 | -0.0013715 |
| text195 | -0.0275106 | 0.0295415 | 0.0164099 | -0.0098814 |
| text196 | -0.0495898 | 0.0891361 | 0.0011682 | -0.0327189 |
| text197 | -0.0295394 | 0.0511662 | -0.0135244 | -0.0088058 |
| text198 | -0.0227541 | 0.0009795 | -0.0099630 | -0.0090466 |
| text199 | -0.0230107 | -0.0147360 | 0.0055331 | 0.0113032 |
| text200 | -0.0209480 | -0.0045301 | 0.0089428 | 0.0109121 |
| text201 | -0.0266815 | 0.0076831 | 0.0157233 | 0.0134763 |
| text202 | -0.0214656 | 0.0144694 | 0.0012992 | 0.0079005 |
| text203 | -0.0343508 | 0.0192556 | 0.0276279 | 0.0126511 |
| text204 | -0.0226707 | -0.0001894 | 0.0165146 | 0.0050609 |
| text205 | -0.0213267 | 0.0057986 | -0.0027494 | -0.0032255 |
| text206 | -0.0267909 | -0.0311451 | 0.0266560 | -0.0116749 |
| text207 | -0.0296556 | 0.0057585 | 0.0131142 | 0.0067108 |
| text208 | -0.0230199 | -0.0023935 | 0.0070931 | -0.0007709 |
| text209 | -0.0358851 | 0.0913659 | -0.0231707 | -0.0714738 |
| text210 | -0.0363548 | 0.0432097 | -0.0501678 | -0.0497866 |
| text211 | -0.0237017 | 0.0378497 | -0.0194374 | -0.0302199 |
| text212 | -0.0267583 | 0.0182209 | 0.0041828 | -0.0054826 |
| text213 | -0.0382702 | 0.0154236 | 0.0027039 | 0.0176053 |
| text214 | -0.0329014 | 0.0236960 | -0.0212561 | 0.0029021 |
| text215 | -0.0382527 | 0.0457724 | -0.0000462 | 0.0115791 |
| text216 | -0.0303664 | 0.0426079 | 0.0084288 | 0.0174619 |
| text217 | -0.0254046 | -0.0021138 | 0.0003308 | 0.0035142 |
| text218 | -0.0236368 | 0.0045578 | 0.0069398 | -0.0019974 |
| text219 | -0.0337199 | 0.0023399 | -0.0087739 | 0.0092073 |
| text220 | -0.0217051 | -0.0024124 | -0.0073040 | 0.0009637 |
| text221 | -0.0279847 | -0.0012839 | 0.0076873 | -0.0149049 |
| text222 | -0.0346561 | 0.0099346 | 0.0067823 | 0.0056875 |
| text223 | -0.0386104 | 0.0297445 | 0.0224357 | 0.0351533 |
| text224 | -0.0169048 | 0.0142876 | 0.0050891 | 0.0119457 |
| text225 | -0.0280442 | -0.0239659 | 0.0251730 | -0.0109070 |
| text226 | -0.0320561 | -0.0135980 | 0.0037052 | 0.0107657 |
| text227 | -0.0283364 | -0.0129288 | 0.0123105 | -0.0033214 |
| text228 | -0.0496937 | -0.0335365 | 0.0235978 | -0.0144798 |
| text229 | -0.0506091 | -0.0235835 | 0.0369757 | -0.0081879 |
| text230 | -0.0429816 | -0.0150868 | -0.0045988 | -0.0101625 |
| text231 | -0.0184561 | -0.0034977 | 0.0113900 | 0.0004809 |
| text232 | -0.0432677 | 0.0582134 | 0.0065466 | -0.0177905 |
| text233 | -0.0249511 | 0.0470744 | 0.0087902 | -0.0013826 |
[LSA document-coordinate matrix, continued: rows text234 through text811, each giving a transcript's scores on four latent semantic dimensions. The full numeric output is omitted here for readability.]
| text812 | -0.0249738 | 0.0117238 | -0.0381514 | -0.0117022 |
| text813 | -0.0372078 | 0.0259144 | -0.0647108 | -0.0329797 |
| text814 | -0.0282585 | 0.0141437 | -0.0551581 | -0.0101855 |
| text815 | -0.0142636 | 0.0073679 | -0.0284143 | -0.0034889 |
| text816 | -0.0221739 | 0.0035560 | -0.0097394 | 0.0040574 |
| text817 | -0.0213579 | -0.0080142 | -0.0010782 | 0.0025801 |
| text818 | -0.0110713 | -0.0110087 | -0.0239643 | 0.0050842 |
| text819 | -0.0119998 | -0.0041109 | -0.0008312 | 0.0020238 |
| text820 | -0.0174976 | -0.0176366 | 0.0045640 | -0.0038476 |
| text821 | -0.0265416 | -0.0128678 | -0.0034525 | 0.0045337 |
| text822 | -0.0170792 | -0.0052487 | -0.0160404 | 0.0076725 |
| text823 | -0.0119603 | -0.0073002 | 0.0015167 | 0.0019879 |
| text824 | -0.0088133 | -0.0028057 | -0.0045393 | 0.0035032 |
| text825 | -0.0007254 | -0.0015012 | -0.0047478 | 0.0006645 |
| text826 | -0.0291937 | 0.0007550 | -0.0752154 | 0.0097422 |
| text827 | -0.0258416 | -0.0197223 | -0.0642005 | 0.0171389 |
| text828 | -0.0268698 | 0.0009738 | -0.0561612 | 0.0115985 |
| text829 | -0.0308647 | -0.0140341 | -0.0468128 | -0.0040811 |
| text830 | -0.0140170 | -0.0005162 | -0.0097782 | -0.0019933 |
| text831 | -0.0374563 | -0.0310013 | -0.0184682 | -0.0063199 |
| text832 | -0.0435357 | -0.0205322 | -0.0238136 | 0.0036042 |
| text833 | -0.0410213 | -0.0077093 | -0.0542770 | 0.0015111 |
| text834 | -0.0327276 | -0.0171857 | -0.0773719 | -0.0010295 |
| text835 | -0.0398451 | -0.0076245 | -0.0515509 | 0.0067856 |
| text836 | -0.0174905 | -0.0103094 | -0.0155807 | -0.0008209 |
| text837 | -0.0093658 | -0.0071321 | -0.0206612 | 0.0030554 |
| text838 | -0.0234194 | -0.0063738 | -0.0317421 | 0.0099844 |
| text839 | -0.0174626 | -0.0010600 | -0.0246669 | 0.0095145 |
| text840 | -0.0134556 | 0.0000886 | -0.0248233 | 0.0050131 |
| text841 | -0.0180200 | -0.0162425 | -0.0455879 | 0.0050512 |
| text842 | -0.0178529 | -0.0138989 | -0.0286606 | 0.0016665 |
| text843 | -0.0186503 | -0.0135428 | -0.0439949 | 0.0067326 |
| text844 | -0.0330403 | -0.0219064 | -0.0464891 | 0.0017734 |
| text845 | -0.0347152 | -0.0280631 | -0.0635191 | -0.0042524 |
| text846 | -0.0043405 | -0.0017015 | -0.0078770 | 0.0002209 |
| text847 | -0.0240510 | -0.0086199 | -0.0192074 | -0.0046285 |
| text848 | -0.0299511 | -0.0154345 | -0.0731904 | 0.0081947 |
| text849 | -0.0259783 | -0.0262768 | -0.0825090 | 0.0046876 |
| text850 | -0.0250407 | -0.0227242 | -0.0639937 | 0.0029839 |
| text851 | -0.0002872 | -0.0002396 | 0.0001031 | 0.0000393 |
| text852 | -0.0354031 | -0.0000930 | -0.0477127 | 0.0062030 |
| text853 | -0.0342267 | 0.0075723 | -0.0316688 | 0.0089985 |
| text854 | -0.0332866 | -0.0132643 | -0.0556133 | 0.0104983 |
| text855 | -0.0163143 | -0.0069402 | -0.0080226 | 0.0056928 |
| text856 | -0.0158569 | 0.0033748 | -0.0297363 | 0.0030731 |
| text857 | -0.0226079 | -0.0109817 | -0.0676568 | 0.0139221 |
| text858 | -0.0189206 | -0.0028855 | -0.0625252 | 0.0083838 |
| text859 | -0.0292370 | -0.0080437 | -0.0666998 | 0.0025766 |
| text860 | -0.0339525 | -0.0300232 | -0.0695655 | 0.0027974 |
| text861 | -0.0165379 | -0.0181290 | -0.0252623 | 0.0098393 |
| text862 | -0.0200581 | -0.0191799 | -0.0660759 | 0.0034478 |
| text863 | -0.0355532 | -0.0418502 | -0.0485124 | -0.0129489 |
| text864 | -0.0040197 | -0.0037470 | 0.0027846 | -0.0021060 |
| text865 | -0.0092222 | -0.0104798 | -0.0067766 | -0.0027969 |
| text866 | -0.0210725 | -0.0135337 | -0.0373636 | 0.0050228 |
| text867 | -0.0253325 | -0.0120135 | -0.0412900 | 0.0069376 |
| text868 | -0.0203687 | -0.0101152 | -0.0576526 | 0.0082356 |
| text869 | -0.0226747 | -0.0316318 | -0.1001747 | 0.0159847 |
| text870 | -0.0196377 | -0.0059242 | -0.0626730 | 0.0038382 |
| text871 | -0.0297333 | -0.0301460 | -0.0409697 | 0.0118126 |
| text872 | -0.0302028 | -0.0168988 | -0.0651069 | 0.0056686 |
| text873 | -0.0322966 | -0.0243903 | -0.0610873 | 0.0148815 |
| text874 | -0.0145459 | -0.0113007 | -0.0248381 | 0.0089642 |
| text875 | -0.0309109 | -0.0293697 | -0.0822278 | 0.0110035 |
| text876 | -0.0241582 | -0.0191530 | -0.0696316 | 0.0125359 |
| text877 | -0.0126325 | -0.0149790 | -0.0388443 | 0.0049787 |
| text878 | -0.0427579 | -0.0473676 | -0.0995944 | 0.0038078 |
| text879 | -0.0073875 | -0.0089186 | -0.0100753 | 0.0004033 |
| text880 | -0.0391150 | -0.0426468 | -0.0399896 | -0.0033735 |
| text881 | -0.0343836 | -0.0253447 | -0.0049330 | 0.0007891 |
| text882 | -0.0339413 | -0.0335972 | -0.0209121 | -0.0027975 |
| text883 | -0.0064514 | -0.0007839 | -0.0141162 | 0.0017345 |
| text884 | -0.0171627 | -0.0181548 | -0.0269702 | 0.0060262 |
| text885 | -0.0206666 | -0.0110379 | -0.0263627 | 0.0074540 |
| text886 | -0.0206734 | -0.0183648 | -0.0119628 | -0.0060703 |
| text887 | -0.0207066 | -0.0155331 | -0.0345383 | -0.0054519 |
| text888 | -0.0101435 | -0.0060456 | -0.0096442 | -0.0018103 |
| text889 | -0.0300701 | -0.0211029 | -0.0645018 | -0.0021909 |
| text890 | -0.0399696 | -0.0057900 | -0.0440921 | -0.0118256 |
| text891 | -0.0277793 | -0.0245911 | -0.0987672 | -0.0016017 |
| text892 | -0.0011158 | -0.0010844 | -0.0019564 | 0.0010115 |
| text893 | -0.0205613 | -0.0096583 | -0.0419812 | 0.0079625 |
| text894 | -0.0365594 | -0.0062950 | -0.0871302 | 0.0303236 |
| text895 | -0.0264880 | 0.0031134 | -0.0430303 | 0.0074945 |
| text896 | -0.0141952 | -0.0036915 | -0.0282469 | 0.0078085 |
| text897 | -0.0172569 | -0.0019964 | -0.0454974 | -0.0002566 |
| text898 | -0.0055544 | 0.0021346 | -0.0059836 | -0.0015388 |
| text899 | -0.0184882 | -0.0096826 | -0.0270811 | 0.0053318 |
| text900 | -0.0188169 | -0.0077380 | -0.0140337 | -0.0041507 |
| text901 | -0.0181958 | -0.0091021 | 0.0038981 | 0.0020740 |
| text902 | -0.0097807 | -0.0039097 | -0.0099177 | 0.0081414 |
| text903 | -0.0187350 | -0.0160944 | -0.0280648 | 0.0017192 |
| text904 | -0.0081148 | -0.0090510 | -0.0080471 | 0.0001756 |
| text905 | -0.0268411 | -0.0109000 | -0.0310650 | 0.0104022 |
| text906 | -0.0204626 | 0.0009244 | -0.0479797 | 0.0121036 |
| text907 | -0.0147972 | -0.0073699 | -0.0321348 | 0.0084195 |
| text908 | -0.0218845 | -0.0173402 | -0.0688406 | 0.0107036 |
| text909 | -0.0181737 | -0.0094903 | -0.0376972 | 0.0171485 |
| text910 | -0.0082768 | -0.0086804 | -0.0095795 | 0.0003989 |
| text911 | -0.0168424 | -0.0211593 | -0.0773830 | 0.0061084 |
| text912 | -0.0316961 | -0.0133290 | -0.0724139 | 0.0010857 |
| text913 | -0.0328313 | -0.0144705 | -0.1040172 | 0.0002111 |
| text914 | -0.0191959 | -0.0116610 | -0.0338254 | 0.0082706 |
| text915 | -0.0205816 | -0.0171793 | -0.0395825 | 0.0028247 |
| text916 | -0.0187853 | -0.0276759 | -0.0228502 | -0.0058773 |
| text917 | -0.0347473 | -0.0318730 | -0.0668758 | 0.0009507 |
| text918 | -0.0271167 | -0.0192885 | -0.0291921 | 0.0047020 |
| text919 | -0.0166349 | -0.0151023 | -0.0397307 | 0.0035898 |
| text920 | -0.0282612 | -0.0158691 | -0.0662775 | 0.0045525 |
| text921 | -0.0251312 | -0.0090074 | -0.0134323 | -0.0034945 |
| text922 | -0.0186395 | -0.0251137 | -0.0268148 | 0.0013134 |
| text923 | -0.0165379 | -0.0181290 | -0.0252623 | 0.0098393 |
| text924 | -0.0200581 | -0.0191799 | -0.0660759 | 0.0034478 |
| text925 | -0.0355532 | -0.0418502 | -0.0485124 | -0.0129489 |
| text926 | -0.0040197 | -0.0037470 | 0.0027846 | -0.0021060 |
| text927 | -0.0347473 | -0.0318730 | -0.0668758 | 0.0009507 |
| text928 | -0.0271167 | -0.0192885 | -0.0291921 | 0.0047020 |
| text929 | -0.0166349 | -0.0151023 | -0.0397307 | 0.0035898 |
| text930 | -0.0282612 | -0.0158691 | -0.0662775 | 0.0045525 |
| text931 | -0.0251312 | -0.0090074 | -0.0134323 | -0.0034945 |
| text932 | -0.0186395 | -0.0251137 | -0.0268148 | 0.0013134 |
| text933 | -0.0253364 | -0.0179823 | -0.0756588 | 0.0075839 |
| text934 | -0.0413238 | -0.0450061 | -0.1064322 | -0.0008177 |
| text935 | -0.0053250 | -0.0028966 | -0.0203556 | 0.0026190 |
| text936 | -0.0246221 | -0.0104997 | -0.0817678 | 0.0148945 |
| text937 | -0.0156260 | -0.0129190 | -0.0605708 | 0.0088637 |
| text938 | -0.0168424 | -0.0211593 | -0.0773830 | 0.0061084 |
| text939 | -0.0316961 | -0.0133290 | -0.0724139 | 0.0010857 |
| text940 | -0.0328313 | -0.0144705 | -0.1040172 | 0.0002111 |
| text941 | -0.0191959 | -0.0116610 | -0.0338254 | 0.0082706 |
| text942 | -0.0205816 | -0.0171793 | -0.0395825 | 0.0028247 |
| text943 | -0.0187853 | -0.0276759 | -0.0228502 | -0.0058773 |
| text944 | -0.0171627 | -0.0181548 | -0.0269702 | 0.0060262 |
| text945 | -0.0206666 | -0.0110379 | -0.0263627 | 0.0074540 |
| text946 | -0.0206734 | -0.0183648 | -0.0119628 | -0.0060703 |
| text947 | -0.0207066 | -0.0155331 | -0.0345383 | -0.0054519 |
| text948 | -0.0101435 | -0.0060456 | -0.0096442 | -0.0018103 |
| text949 | -0.0285206 | -0.0122887 | -0.0358505 | 0.0094772 |
| text950 | -0.0214521 | 0.0040255 | -0.0058889 | 0.0071715 |
| text951 | -0.0208959 | -0.0035102 | -0.0152927 | 0.0147653 |
| text952 | -0.0196488 | -0.0044791 | -0.0528934 | 0.0102331 |
| text953 | -0.0182883 | -0.0200285 | -0.0694057 | 0.0123408 |
| text954 | -0.0249253 | -0.0085024 | -0.0432266 | 0.0114850 |
| text955 | -0.0229805 | -0.0031223 | -0.0333391 | 0.0068808 |
| text956 | -0.0064666 | -0.0012948 | -0.0058669 | 0.0005619 |
| text957 | -0.0163111 | -0.0115031 | -0.0315518 | 0.0104993 |
| text958 | -0.0132041 | 0.0030236 | -0.0188222 | -0.0006064 |
| text959 | -0.0210130 | -0.0174595 | 0.0083811 | -0.0008428 |
| text960 | -0.0167568 | -0.0057841 | -0.0123284 | 0.0059048 |
| text961 | -0.0304109 | -0.0130437 | -0.0268415 | -0.0021811 |
| text962 | -0.0219343 | -0.0092149 | -0.0491761 | 0.0046681 |
| text963 | -0.0123540 | 0.0012404 | -0.0062043 | -0.0045109 |
| text964 | -0.0238948 | -0.0069086 | -0.0417913 | 0.0146149 |
| text965 | -0.0144630 | -0.0086483 | -0.0331443 | 0.0115331 |
| text966 | -0.0182867 | -0.0021175 | -0.0183361 | 0.0103607 |
| text967 | -0.0086448 | -0.0077075 | -0.0225850 | 0.0049049 |
| text968 | -0.0269748 | -0.0111008 | -0.0206352 | 0.0113117 |
| text969 | -0.0211349 | -0.0096055 | -0.0223207 | 0.0070096 |
| text970 | -0.0255416 | -0.0191659 | -0.0318562 | 0.0126993 |
| text971 | -0.0278116 | -0.0281338 | 0.0239548 | -0.0162269 |
| text972 | -0.0157893 | -0.0150175 | 0.0135764 | -0.0142609 |
| text973 | -0.0080500 | -0.0126419 | 0.0122792 | -0.0053570 |
| text974 | -0.0122513 | -0.0129035 | -0.0008349 | 0.0025687 |
| text975 | -0.0048812 | -0.0051414 | -0.0016636 | 0.0010319 |
| text976 | -0.0427143 | -0.0204899 | 0.0465079 | -0.0210583 |
| text977 | -0.0130586 | 0.0036905 | 0.0157917 | -0.0004372 |
| text978 | -0.0162711 | 0.0052425 | 0.0081006 | 0.0063403 |
| text979 | -0.0101999 | 0.0049494 | 0.0004427 | 0.0046587 |
| text980 | -0.0196596 | -0.0005344 | 0.0119645 | 0.0043269 |
| text981 | -0.0155899 | 0.0072004 | 0.0082128 | 0.0108938 |
| text982 | -0.0200997 | 0.0071630 | 0.0045344 | 0.0068607 |
| text983 | -0.0271222 | 0.0143427 | 0.0186385 | 0.0000522 |
| text984 | -0.0192048 | 0.0129309 | 0.0180712 | 0.0085216 |
| text985 | -0.0427765 | -0.0168914 | 0.0335288 | -0.0144004 |
| text986 | -0.0331293 | -0.0189951 | 0.0252656 | -0.0075622 |
| text987 | -0.0202849 | -0.0139742 | 0.0117392 | -0.0039394 |
| text988 | -0.0327974 | -0.0323124 | 0.0446035 | -0.0295273 |
| text989 | -0.0437016 | -0.0428602 | 0.0263534 | -0.0235386 |
| text990 | -0.0534911 | -0.0516011 | 0.0270781 | -0.0280755 |
| text991 | -0.0347185 | -0.0250749 | 0.0151293 | -0.0200802 |
| text992 | -0.0440171 | -0.0110749 | 0.0302682 | -0.0166246 |
| text993 | -0.0364797 | -0.0366919 | -0.0036104 | -0.0061204 |
| text994 | -0.0008475 | -0.0006209 | -0.0003973 | -0.0005861 |
| text995 | -0.0459199 | 0.0024594 | 0.0040588 | -0.0086394 |
| text996 | -0.0222983 | -0.0088770 | 0.0070632 | 0.0000370 |
| text997 | -0.0199133 | -0.0241320 | 0.0191006 | -0.0060704 |
| text998 | -0.0182670 | -0.0168302 | 0.0203179 | 0.0003837 |
| text999 | -0.0253510 | -0.0202274 | 0.0228139 | 0.0030059 |
| text1000 | -0.0148204 | -0.0153495 | 0.0098272 | -0.0029220 |
| text1001 | -0.0106957 | -0.0087951 | -0.0023483 | -0.0002370 |
| text1002 | -0.0296979 | -0.0133978 | 0.0095543 | -0.0013670 |
| text1003 | -0.0259599 | -0.0261526 | 0.0298540 | -0.0116642 |
| text1004 | -0.0186161 | -0.0162424 | 0.0131340 | -0.0097736 |
| text1005 | -0.0148105 | -0.0101185 | 0.0109881 | -0.0008418 |
| text1006 | -0.0106093 | -0.0049074 | 0.0095107 | -0.0064780 |
| text1007 | -0.0110811 | -0.0039219 | 0.0070118 | 0.0060488 |
| text1008 | -0.0343406 | -0.0395741 | 0.0272883 | -0.0066680 |
| text1009 | -0.0375276 | -0.0482823 | 0.0314108 | -0.0242650 |
| text1010 | -0.0266996 | -0.0178792 | 0.0136309 | -0.0147277 |
| text1011 | -0.0410336 | -0.0407872 | 0.0311239 | -0.0315325 |
| text1012 | -0.0371324 | -0.0030344 | 0.0358897 | -0.0243311 |
| text1013 | -0.0200584 | -0.0184960 | 0.0195454 | -0.0137290 |
| text1014 | -0.0167595 | -0.0216177 | 0.0154511 | -0.0112841 |
| text1015 | -0.0274121 | -0.0140946 | 0.0247936 | -0.0024696 |
| text1016 | -0.0288419 | -0.0120511 | 0.0037117 | -0.0030342 |
| text1017 | -0.0244102 | -0.0126251 | 0.0112413 | 0.0014102 |
| text1018 | -0.0107993 | -0.0094465 | 0.0127992 | -0.0014810 |
| text1019 | -0.0236294 | -0.0240361 | 0.0179999 | -0.0007793 |
| text1020 | -0.0170284 | -0.0071229 | 0.0070251 | -0.0015913 |
| text1021 | -0.0173950 | -0.0176527 | 0.0052537 | -0.0012543 |
| text1022 | -0.0292867 | -0.0343303 | 0.0129552 | -0.0084455 |
| text1023 | -0.0350570 | -0.0369253 | 0.0152313 | -0.0149223 |
| text1024 | -0.0364354 | -0.0438409 | 0.0512240 | -0.0232841 |
| text1025 | -0.0177159 | -0.0229580 | 0.0322009 | -0.0100148 |
| text1026 | -0.0203631 | -0.0132749 | 0.0246037 | -0.0067218 |
| text1027 | -0.0331856 | -0.0332235 | 0.0393552 | -0.0223275 |
| text1028 | -0.0260436 | -0.0176874 | 0.0018306 | 0.0006090 |
| text1029 | -0.0180035 | -0.0203534 | 0.0166533 | -0.0094436 |
| text1030 | -0.0400319 | -0.0333639 | 0.0379429 | -0.0210069 |
| text1031 | -0.0307489 | -0.0355127 | 0.0503388 | -0.0296360 |
| text1032 | -0.0168494 | -0.0131942 | 0.0079250 | -0.0083329 |
| text1033 | -0.0194875 | -0.0269629 | 0.0282690 | -0.0095469 |
| text1034 | -0.0274669 | -0.0263252 | 0.0430573 | -0.0191661 |
| text1035 | -0.0167799 | -0.0066639 | 0.0103569 | -0.0025784 |
| text1036 | -0.0137971 | -0.0049301 | 0.0068973 | 0.0044464 |
| text1037 | -0.0245749 | -0.0131535 | 0.0236041 | 0.0108875 |
| text1038 | -0.0033231 | -0.0003986 | -0.0001715 | 0.0070108 |
| text1039 | -0.0087714 | -0.0025431 | 0.0012725 | 0.0059852 |
| text1040 | -0.0224213 | -0.0172483 | 0.0199918 | -0.0018369 |
| text1041 | -0.0088398 | -0.0130255 | 0.0203162 | -0.0069832 |
| text1042 | -0.0188657 | -0.0058812 | 0.0105446 | 0.0096387 |
| text1043 | -0.0189454 | -0.0089958 | 0.0192648 | -0.0098962 |
| text1044 | -0.0006962 | -0.0002344 | 0.0005250 | 0.0010196 |
| text1045 | -0.0185020 | -0.0136593 | 0.0113492 | -0.0016568 |
| text1046 | -0.0227613 | -0.0214204 | 0.0186286 | -0.0047883 |
| text1047 | -0.0301263 | -0.0170928 | 0.0191518 | -0.0044547 |
| text1048 | -0.0425327 | -0.0466923 | 0.0558651 | -0.0246546 |
| text1049 | -0.0281420 | -0.0385997 | 0.0440817 | -0.0168740 |
| text1050 | -0.0129889 | -0.0156135 | 0.0144322 | -0.0058495 |
| text1051 | -0.0374100 | -0.0218957 | 0.0215193 | -0.0122758 |
| text1052 | -0.0213398 | -0.0073919 | 0.0073641 | 0.0079071 |
| text1053 | -0.0238665 | -0.0125220 | 0.0106663 | 0.0044977 |
| text1054 | -0.0207800 | -0.0083133 | 0.0141805 | -0.0022193 |
| text1055 | -0.0344362 | -0.0132144 | 0.0210267 | -0.0018803 |
| text1056 | -0.0256428 | -0.0198046 | 0.0112224 | 0.0029558 |
| text1057 | -0.0013663 | -0.0015318 | 0.0000116 | -0.0004548 |
| text1058 | -0.0218616 | -0.0056239 | 0.0030580 | -0.0071827 |
| text1059 | -0.0135900 | -0.0028847 | -0.0060158 | 0.0053515 |
| text1060 | -0.0199968 | -0.0167813 | 0.0136490 | 0.0014655 |
| text1061 | -0.0145692 | -0.0058229 | 0.0051537 | -0.0038851 |
| text1062 | -0.0133181 | -0.0026586 | 0.0031714 | -0.0008538 |
| text1063 | -0.0147559 | -0.0108545 | 0.0124960 | -0.0064305 |
| text1064 | -0.0169934 | -0.0099646 | 0.0089062 | -0.0028468 |
| text1065 | -0.0110942 | -0.0016542 | 0.0066818 | -0.0031153 |
| text1066 | -0.0301082 | -0.0267304 | 0.0212238 | -0.0123914 |
| text1067 | -0.0357462 | -0.0350905 | 0.0397961 | -0.0236768 |
| text1068 | -0.0191147 | -0.0263595 | 0.0301517 | -0.0148957 |
| text1069 | -0.0140342 | -0.0143280 | 0.0112016 | 0.0004743 |
| text1070 | -0.0330694 | -0.0325035 | 0.0315380 | -0.0155135 |
| text1071 | -0.0193996 | -0.0149025 | 0.0154405 | -0.0014897 |
| text1072 | -0.0314103 | -0.0160499 | 0.0152176 | 0.0043838 |
| text1073 | -0.0191822 | -0.0162118 | 0.0171189 | 0.0011733 |
| text1074 | -0.0061031 | -0.0026677 | 0.0033547 | 0.0038679 |
| text1075 | -0.0149183 | -0.0103062 | 0.0025874 | 0.0012922 |
| text1076 | -0.0225727 | -0.0213056 | 0.0032175 | -0.0009677 |
| text1077 | -0.0127823 | -0.0179147 | 0.0112895 | -0.0038460 |
| text1078 | -0.0242090 | -0.0276832 | 0.0264788 | -0.0101104 |
| text1079 | -0.0102434 | -0.0026500 | 0.0025668 | -0.0011791 |
| text1080 | -0.0280601 | -0.0228671 | 0.0229762 | -0.0070365 |
| text1081 | -0.0211541 | -0.0212542 | 0.0148841 | 0.0006800 |
| text1082 | -0.0116575 | -0.0127642 | 0.0104312 | -0.0035303 |
| text1083 | -0.0127823 | -0.0179147 | 0.0112895 | -0.0038460 |
| text1084 | -0.0242090 | -0.0276832 | 0.0264788 | -0.0101104 |
| text1085 | -0.0102434 | -0.0026500 | 0.0025668 | -0.0011791 |
| text1086 | -0.0280601 | -0.0228671 | 0.0229762 | -0.0070365 |
| text1087 | -0.0211541 | -0.0212542 | 0.0148841 | 0.0006800 |
| text1088 | -0.0116575 | -0.0127642 | 0.0104312 | -0.0035303 |
| text1089 | -0.0120955 | 0.0022138 | 0.0038820 | -0.0049866 |
| text1090 | -0.0339960 | 0.0082598 | 0.0133788 | -0.0034171 |
| text1091 | -0.0230172 | -0.0197519 | 0.0250411 | -0.0169886 |
| text1092 | -0.0215586 | 0.0217395 | 0.0071620 | -0.0090691 |
| text1093 | -0.0199282 | -0.0078247 | 0.0165786 | -0.0127473 |
| text1094 | -0.0260822 | -0.0066576 | 0.0175457 | -0.0118609 |
| text1095 | -0.0075181 | -0.0040152 | 0.0118922 | -0.0060544 |
| text1096 | -0.0135084 | -0.0022289 | 0.0096303 | 0.0012631 |
| text1097 | -0.0252297 | -0.0173488 | 0.0175742 | -0.0049624 |
| text1098 | -0.0180520 | -0.0124605 | 0.0183317 | 0.0008588 |
| text1099 | -0.0122480 | -0.0104971 | 0.0140668 | 0.0003056 |
| text1100 | -0.0169052 | -0.0207213 | 0.0032732 | -0.0024844 |
| text1101 | -0.0086570 | -0.0130601 | 0.0156473 | -0.0037348 |
| text1102 | -0.0185043 | -0.0223589 | 0.0253971 | -0.0097434 |
| text1103 | -0.0207532 | -0.0109373 | 0.0203626 | -0.0072246 |
| text1104 | -0.0385905 | -0.0329470 | 0.0363309 | -0.0260472 |
| text1105 | -0.0271036 | -0.0183017 | 0.0219703 | -0.0084023 |
| text1106 | -0.0347232 | -0.0120711 | 0.0307205 | -0.0250650 |
| text1107 | -0.0244134 | -0.0173481 | 0.0288205 | -0.0189699 |
| text1108 | -0.0338459 | -0.0203859 | 0.0271904 | -0.0037017 |
| text1109 | -0.0265995 | -0.0145913 | 0.0197746 | -0.0048818 |
| text1110 | -0.0243288 | -0.0233355 | 0.0239065 | -0.0101794 |
| text1111 | -0.0134916 | -0.0106406 | 0.0102408 | -0.0012468 |
| text1112 | -0.0285933 | -0.0061588 | 0.0150182 | -0.0143522 |
| text1113 | -0.0276880 | 0.0006969 | 0.0135998 | -0.0137279 |
| text1114 | -0.0293462 | -0.0107327 | 0.0300820 | -0.0158101 |
| text1115 | -0.0429433 | -0.0263429 | 0.0366126 | -0.0240406 |
| text1116 | -0.0414755 | -0.0381183 | 0.0505959 | -0.0126749 |
| text1117 | -0.0197483 | -0.0047595 | 0.0112585 | -0.0032263 |
| text1118 | -0.0339208 | -0.0310401 | 0.0256710 | -0.0104684 |
| text1119 | -0.0227860 | -0.0149656 | 0.0128910 | -0.0041926 |
| text1120 | -0.0186545 | -0.0088624 | 0.0038937 | -0.0014272 |
| text1121 | -0.0280833 | -0.0026894 | 0.0078399 | -0.0081628 |
| text1122 | -0.0406032 | -0.0472599 | 0.0473861 | -0.0378818 |
| text1123 | -0.0230989 | -0.0371119 | 0.0211500 | -0.0219859 |
| text1124 | -0.0268936 | -0.0247678 | 0.0122605 | -0.0156171 |
| text1125 | -0.0290895 | -0.0385634 | 0.0322899 | -0.0194282 |
| text1126 | -0.0331620 | -0.0196299 | 0.0197281 | -0.0138431 |
| text1127 | -0.0246749 | -0.0194719 | 0.0166857 | -0.0112171 |
| text1128 | -0.0260543 | -0.0349584 | 0.0306373 | -0.0265864 |
| text1129 | -0.0201163 | -0.0161950 | 0.0129719 | -0.0090445 |
| text1130 | -0.0115481 | -0.0094217 | -0.0029372 | 0.0068683 |
| text1131 | -0.0178616 | -0.0053795 | -0.0283041 | 0.0119091 |
| text1132 | -0.0099561 | -0.0046184 | -0.0188880 | 0.0047869 |
| text1133 | -0.0204328 | -0.0144405 | 0.0236294 | -0.0034613 |
| text1134 | -0.0253171 | -0.0093082 | 0.0190932 | 0.0010040 |
| text1135 | -0.0243174 | -0.0141616 | 0.0102640 | -0.0117772 |
| text1136 | -0.0279239 | -0.0249203 | 0.0299535 | -0.0075017 |
| text1137 | -0.0182281 | -0.0160537 | 0.0169322 | -0.0062249 |
| text1138 | -0.0094893 | -0.0005915 | 0.0069770 | -0.0037016 |
| text1139 | -0.0244850 | -0.0203092 | 0.0099481 | -0.0068665 |
| text1140 | -0.0195475 | -0.0148625 | 0.0058123 | -0.0022997 |
| text1141 | -0.0185958 | -0.0044579 | 0.0048772 | -0.0014560 |
| text1142 | -0.0152063 | -0.0083257 | 0.0065750 | 0.0034493 |
| text1143 | -0.0280628 | -0.0053093 | 0.0140783 | -0.0073832 |
| text1144 | -0.0306228 | -0.0144019 | 0.0052091 | -0.0060603 |
| text1145 | -0.0060628 | -0.0021983 | 0.0029448 | 0.0013403 |
| text1146 | -0.0223725 | -0.0153755 | 0.0203882 | -0.0101544 |
| text1147 | -0.0260141 | -0.0152868 | 0.0196228 | -0.0105586 |
| text1148 | -0.0352998 | -0.0349879 | 0.0450944 | -0.0290702 |
| text1149 | -0.0219808 | -0.0226549 | 0.0156450 | -0.0041725 |
| text1150 | -0.0280772 | -0.0174864 | 0.0247098 | -0.0211564 |
| text1151 | -0.0129991 | -0.0104736 | 0.0087091 | -0.0044850 |
| text1152 | -0.0323853 | -0.0252641 | 0.0270136 | -0.0129498 |
| text1153 | -0.0337343 | -0.0336434 | 0.0217547 | -0.0156746 |
| text1154 | -0.0307037 | -0.0219175 | 0.0382608 | -0.0194922 |
| text1155 | -0.0139452 | -0.0050602 | 0.0113728 | -0.0088114 |
| text1156 | -0.0078122 | -0.0103981 | 0.0091591 | -0.0068396 |
| text1157 | -0.0204544 | -0.0189424 | 0.0269715 | -0.0040111 |
| text1158 | -0.0194380 | -0.0133049 | 0.0066332 | 0.0040570 |
| text1159 | -0.0140599 | -0.0035991 | 0.0027866 | 0.0088005 |
| text1160 | -0.0125310 | -0.0101652 | 0.0108666 | 0.0044738 |
| text1161 | -0.0050203 | -0.0052008 | 0.0042657 | -0.0006224 |
| text1162 | -0.0222305 | -0.0343566 | 0.0312984 | -0.0137673 |
| text1163 | -0.0221717 | -0.0106390 | 0.0095460 | 0.0035989 |
| text1164 | -0.0165023 | -0.0154854 | 0.0121898 | -0.0010480 |
| text1165 | -0.0189102 | -0.0131573 | 0.0145882 | -0.0083644 |
| text1166 | -0.0209061 | -0.0181060 | 0.0152283 | 0.0015395 |
| text1167 | -0.0132485 | -0.0102836 | 0.0072522 | -0.0019931 |
| text1168 | -0.0253133 | -0.0087489 | 0.0012633 | -0.0055113 |
| text1169 | -0.0263879 | -0.0122597 | 0.0231498 | 0.0036716 |
| text1170 | -0.0135500 | -0.0010902 | 0.0052918 | 0.0023271 |
| text1171 | -0.0307963 | -0.0267395 | 0.0446514 | -0.0326795 |
| text1172 | -0.0206176 | -0.0121215 | 0.0117303 | -0.0004180 |
| text1173 | -0.0111715 | -0.0098135 | 0.0080941 | -0.0052822 |
| text1174 | -0.0253133 | -0.0087489 | 0.0012633 | -0.0055113 |
| text1175 | -0.0263879 | -0.0122597 | 0.0231498 | 0.0036716 |
| text1176 | -0.0135500 | -0.0010902 | 0.0052918 | 0.0023271 |
| text1177 | -0.0307963 | -0.0267395 | 0.0446514 | -0.0326795 |
| text1178 | -0.0206176 | -0.0121215 | 0.0117303 | -0.0004180 |
| text1179 | -0.0111715 | -0.0098135 | 0.0080941 | -0.0052822 |
| text1180 | -0.0338346 | -0.0004313 | 0.0085451 | 0.0042449 |
| text1181 | -0.0175080 | -0.0074185 | 0.0188767 | 0.0004048 |
| text1182 | -0.0202454 | -0.0070456 | 0.0091003 | 0.0057953 |
| text1183 | -0.0155649 | -0.0067177 | 0.0129922 | 0.0030009 |
| text1184 | -0.0245622 | -0.0164076 | 0.0205107 | -0.0060800 |
| text1185 | -0.0272548 | -0.0293760 | 0.0307847 | -0.0016413 |
| text1186 | -0.0180257 | -0.0187436 | 0.0205471 | -0.0041048 |
| text1187 | -0.0289491 | -0.0165774 | 0.0120793 | 0.0021229 |
| text1188 | -0.0117816 | -0.0060870 | 0.0047310 | 0.0067572 |
| text1189 | -0.0289491 | -0.0165774 | 0.0120793 | 0.0021229 |
| text1190 | -0.0117816 | -0.0060870 | 0.0047310 | 0.0067572 |
| text1191 | -0.0220958 | -0.0344722 | 0.0354198 | -0.0219007 |
| text1192 | -0.0135438 | -0.0213612 | 0.0186945 | -0.0151176 |
| text1193 | -0.0155481 | -0.0152616 | 0.0146011 | -0.0042045 |
| text1194 | -0.0303074 | -0.0384828 | 0.0401322 | -0.0238906 |
| text1195 | -0.0178494 | -0.0193421 | 0.0227931 | -0.0089324 |
| text1196 | -0.0258268 | -0.0187733 | 0.0248161 | -0.0026782 |
| text1197 | -0.0125308 | -0.0165875 | 0.0203004 | -0.0128399 |
| text1198 | -0.0189107 | -0.0205081 | 0.0142706 | -0.0080079 |
| text1199 | -0.0453731 | -0.0414709 | 0.0336012 | -0.0086971 |
| text1200 | -0.0236201 | -0.0223427 | 0.0113457 | -0.0039668 |
| text1201 | -0.0355549 | -0.0522757 | 0.0577249 | -0.0257034 |
| text1202 | -0.0100708 | -0.0075089 | 0.0060412 | -0.0023116 |
| text1203 | -0.0200904 | -0.0020782 | 0.0270178 | -0.0162808 |
| text1204 | -0.0115494 | -0.0032823 | 0.0083166 | -0.0061974 |
| text1205 | -0.0158367 | -0.0097220 | 0.0110677 | -0.0041147 |
| text1206 | -0.0101677 | -0.0084394 | 0.0024466 | 0.0003488 |
| text1207 | -0.0068907 | -0.0051421 | 0.0003538 | 0.0034734 |
| text1208 | -0.0171843 | -0.0152322 | 0.0181958 | -0.0074204 |
| text1209 | -0.0109044 | -0.0060279 | 0.0105468 | -0.0008077 |
| text1210 | -0.0128322 | -0.0081037 | 0.0037305 | 0.0037115 |
| text1211 | -0.0143948 | -0.0089540 | 0.0045398 | 0.0032300 |
| text1212 | -0.0020415 | -0.0002103 | -0.0002492 | -0.0002210 |
| text1213 | -0.0108836 | -0.0110430 | 0.0056160 | 0.0015766 |
| text1214 | -0.0125390 | -0.0114208 | 0.0094343 | -0.0020678 |
| text1215 | -0.0087263 | -0.0049167 | 0.0022342 | 0.0018564 |
| text1216 | -0.0257589 | -0.0227024 | 0.0352722 | -0.0137017 |
| text1217 | -0.0408130 | -0.0465370 | 0.0525037 | -0.0341073 |
| text1218 | -0.0370305 | -0.0422787 | 0.0381864 | -0.0178016 |
| text1219 | -0.0232201 | -0.0203957 | 0.0137902 | -0.0119432 |
| text1220 | -0.0308401 | -0.0193076 | 0.0254629 | 0.0021493 |
| text1221 | -0.0084205 | -0.0087235 | 0.0119914 | -0.0032318 |
| text1222 | -0.0320554 | -0.0361506 | 0.0418538 | -0.0307169 |
| text1223 | -0.0105421 | -0.0118383 | 0.0178973 | -0.0135633 |
| text1224 | -0.0237996 | -0.0187240 | 0.0156239 | 0.0000426 |
| text1225 | -0.0289078 | -0.0253283 | 0.0181026 | 0.0008869 |
| text1226 | -0.0193353 | -0.0248096 | 0.0130660 | -0.0009890 |
| text1227 | -0.0234165 | -0.0221570 | 0.0273415 | -0.0029245 |
| text1228 | -0.0292630 | -0.0318912 | 0.0246364 | -0.0046825 |
| text1229 | -0.0185570 | -0.0284533 | 0.0274355 | -0.0115885 |
| text1230 | -0.0253651 | -0.0223118 | 0.0169288 | -0.0058940 |
| text1231 | -0.0313252 | -0.0175566 | 0.0250452 | -0.0021975 |
| text1232 | -0.0087841 | -0.0063179 | 0.0077797 | 0.0010684 |
| text1233 | -0.0313854 | -0.0446784 | 0.0192770 | -0.0115624 |
| text1234 | -0.0269495 | -0.0430686 | 0.0382685 | -0.0222427 |
| text1235 | -0.0348015 | -0.0354458 | 0.0260897 | -0.0047621 |
| text1236 | -0.0225520 | -0.0231180 | 0.0100001 | -0.0063650 |
| text1237 | -0.0245876 | -0.0147993 | 0.0123905 | 0.0037193 |
| text1238 | -0.0194184 | -0.0108746 | 0.0141911 | 0.0048471 |
| text1239 | -0.0049818 | -0.0041798 | -0.0024463 | 0.0011828 |
| text1240 | -0.0158120 | -0.0188296 | 0.0146039 | -0.0038907 |
| text1241 | -0.0183169 | -0.0088038 | 0.0115462 | -0.0022004 |
| text1242 | -0.0223971 | -0.0051785 | 0.0122446 | -0.0084637 |
| text1243 | -0.0159546 | -0.0167132 | 0.0262412 | -0.0131933 |
| text1244 | -0.0132718 | -0.0094181 | 0.0109666 | -0.0048375 |
| text1245 | -0.0018278 | -0.0012224 | 0.0028021 | -0.0017643 |
| text1246 | -0.0232929 | -0.0122365 | 0.0127699 | 0.0017122 |
| text1247 | -0.0206830 | -0.0113211 | 0.0222255 | 0.0026732 |
| text1248 | -0.0309174 | -0.0032887 | 0.0265568 | -0.0166240 |
| text1249 | -0.0133048 | -0.0108564 | 0.0083389 | -0.0015538 |
| text1250 | -0.0310419 | -0.0202752 | 0.0339144 | -0.0172231 |
| text1251 | -0.0154858 | -0.0072497 | 0.0061040 | 0.0004323 |
| text1252 | -0.0144430 | -0.0133811 | 0.0220287 | -0.0015614 |
| text1253 | -0.0136958 | -0.0077369 | 0.0111449 | 0.0063699 |
| text1254 | -0.0175222 | -0.0022748 | 0.0076530 | 0.0006984 |
| text1255 | -0.0132406 | -0.0099549 | 0.0065488 | -0.0053206 |
| text1256 | -0.0122528 | -0.0112212 | 0.0035871 | -0.0006870 |
| text1257 | -0.0213554 | -0.0128326 | 0.0225194 | -0.0129965 |
| text1258 | -0.0262546 | -0.0145498 | 0.0142721 | -0.0074675 |
| text1259 | -0.0242353 | -0.0235571 | 0.0243342 | -0.0041237 |
| text1260 | -0.0105316 | -0.0058008 | 0.0109402 | 0.0017899 |
| text1261 | -0.0121728 | -0.0071155 | 0.0116553 | -0.0029772 |
| text1262 | -0.0000251 | -0.0000279 | 0.0000919 | -0.0001118 |
| text1263 | -0.0377047 | -0.0422378 | 0.0096580 | -0.0202800 |
| text1264 | -0.0197934 | -0.0230440 | 0.0076974 | -0.0092711 |
| text1265 | -0.0144286 | -0.0106025 | 0.0061444 | 0.0033814 |
| text1266 | -0.0248365 | -0.0339680 | 0.0067009 | -0.0114112 |
| text1267 | -0.0368971 | -0.0344407 | 0.0076351 | 0.0021460 |
| text1268 | -0.0322683 | -0.0064628 | 0.0150060 | -0.0064799 |
| text1269 | -0.0227769 | -0.0093380 | 0.0063918 | 0.0022550 |
| text1270 | -0.0350511 | -0.0114296 | 0.0074299 | -0.0099809 |
| text1271 | -0.0397903 | -0.0177816 | 0.0227249 | -0.0164543 |
| text1272 | -0.0319448 | -0.0252019 | 0.0216191 | -0.0023129 |
| text1273 | -0.0297903 | -0.0186123 | 0.0097407 | 0.0041726 |
| text1274 | -0.0244971 | -0.0130057 | 0.0258249 | -0.0035841 |
| text1275 | -0.0419078 | -0.0443082 | 0.0479571 | -0.0216686 |
| text1276 | -0.0084714 | -0.0057896 | 0.0068551 | -0.0022203 |
| text1277 | -0.0215771 | -0.0249815 | 0.0342694 | -0.0059751 |
| text1278 | -0.0189365 | -0.0148563 | 0.0214280 | 0.0025343 |
| text1279 | -0.0343544 | -0.0389256 | 0.0434151 | -0.0129715 |
| text1280 | -0.0201428 | -0.0227046 | 0.0288973 | -0.0093314 |
| text1281 | -0.0371157 | -0.0384304 | 0.0334478 | -0.0281377 |
| text1282 | -0.0244388 | -0.0205729 | 0.0202058 | -0.0146421 |
| text1283 | -0.0263357 | -0.0216308 | 0.0215080 | -0.0146242 |
| text1284 | -0.0319279 | -0.0435679 | 0.0315295 | -0.0161905 |
| text1285 | -0.0295901 | -0.0306222 | 0.0219509 | -0.0114371 |
| text1286 | -0.0118790 | -0.0158113 | 0.0130598 | -0.0045997 |
| text1287 | -0.0198203 | -0.0197079 | -0.0020638 | -0.0019390 |
| text1288 | -0.0370017 | -0.0278708 | 0.0315988 | -0.0161806 |
| text1289 | -0.0205322 | -0.0203491 | 0.0096171 | -0.0062514 |
| text1290 | -0.0171256 | -0.0166705 | 0.0102353 | -0.0074171 |
| text1291 | -0.0186634 | -0.0126053 | 0.0129029 | -0.0022922 |
| text1292 | -0.0197515 | -0.0224775 | 0.0105363 | -0.0041338 |
| text1293 | -0.0143811 | -0.0164611 | 0.0144214 | -0.0066032 |
| text1294 | -0.0274031 | -0.0361216 | 0.0250000 | -0.0106621 |
| text1295 | -0.0173053 | -0.0166382 | 0.0095629 | -0.0007631 |
| text1296 | -0.0218776 | -0.0236391 | 0.0122054 | -0.0064423 |
| text1297 | -0.0212042 | -0.0204180 | 0.0263196 | -0.0051083 |
| text1298 | -0.0281849 | -0.0341458 | 0.0281037 | -0.0118736 |
| text1299 | -0.0324331 | -0.0702302 | 0.0772247 | -0.0432351 |
| text1300 | -0.0203909 | -0.0370155 | 0.0406581 | -0.0232335 |
| text1301 | -0.0331164 | -0.0245485 | 0.0500120 | -0.0235915 |
| text1302 | -0.0297959 | -0.0326719 | 0.0296014 | -0.0163616 |
| text1303 | -0.0256792 | -0.0265764 | 0.0332029 | -0.0160762 |
| text1304 | -0.0364908 | -0.0491971 | 0.0529159 | -0.0310092 |
| text1305 | -0.0263818 | -0.0079193 | 0.0150449 | -0.0082711 |
| text1306 | -0.0371510 | -0.0415714 | 0.0261596 | -0.0207030 |
| text1307 | -0.0192546 | -0.0083204 | 0.0121943 | -0.0030277 |
| text1308 | -0.0123517 | -0.0091194 | 0.0102810 | -0.0004446 |
| text1309 | -0.0165899 | -0.0107063 | -0.0036836 | 0.0096940 |
| text1310 | -0.0170364 | -0.0077991 | 0.0091901 | 0.0045287 |
| text1311 | -0.0294660 | -0.0267418 | 0.0297496 | -0.0026375 |
| text1312 | -0.0179177 | -0.0087022 | 0.0029880 | 0.0060890 |
| text1313 | -0.0229068 | -0.0190110 | 0.0123486 | -0.0007823 |
| text1314 | -0.0197625 | -0.0166414 | 0.0039465 | 0.0011540 |
| text1315 | -0.0047663 | -0.0047004 | 0.0031627 | -0.0015755 |
| text1316 | -0.0175584 | -0.0150533 | 0.0074115 | 0.0026292 |
| text1317 | -0.0302054 | -0.0285164 | 0.0410343 | -0.0119864 |
| text1318 | -0.0129722 | -0.0127727 | 0.0142757 | -0.0069361 |
| text1319 | -0.0141781 | -0.0106203 | 0.0207878 | 0.0000727 |
| text1320 | -0.0191252 | -0.0124387 | 0.0140555 | -0.0016036 |
| text1321 | -0.0043648 | -0.0032402 | 0.0025957 | -0.0027211 |
| text1322 | -0.0212011 | -0.0177168 | 0.0119843 | -0.0025866 |
| text1323 | -0.0327393 | -0.0354460 | 0.0288433 | -0.0126210 |
| text1324 | -0.0330357 | -0.0262942 | 0.0212871 | -0.0113363 |
| text1325 | -0.0130956 | -0.0116058 | 0.0044406 | -0.0063922 |
| text1326 | -0.0262683 | -0.0322135 | 0.0100353 | -0.0179529 |
| text1327 | -0.0049794 | -0.0052195 | -0.0029593 | -0.0014452 |
| text1328 | -0.0214682 | -0.0179034 | 0.0145458 | 0.0021978 |
| text1329 | -0.0193344 | -0.0191635 | 0.0221284 | -0.0038298 |
| text1330 | -0.0418091 | -0.0200834 | 0.0200841 | 0.0040169 |
| text1331 | -0.0420066 | -0.0592393 | 0.0589938 | -0.0366837 |
| text1332 | -0.0283126 | -0.0408920 | 0.0320490 | -0.0220936 |
| text1333 | -0.0322400 | -0.0275556 | 0.0088936 | 0.0025147 |
| text1334 | -0.0069879 | 0.0066040 | -0.0026317 | -0.0019740 |
| text1335 | -0.0159419 | 0.0013483 | -0.0028675 | -0.0026103 |
| text1336 | -0.0106301 | -0.0068354 | -0.0061622 | 0.0076143 |
| text1337 | -0.0330354 | -0.0119123 | 0.0132233 | 0.0024219 |
| text1338 | -0.0229965 | -0.0099492 | 0.0163584 | 0.0101580 |
| text1339 | -0.0040152 | -0.0052168 | 0.0041046 | -0.0010303 |
| text1340 | -0.0215651 | -0.0228215 | 0.0162005 | -0.0001731 |
| text1341 | -0.0085816 | -0.0081415 | -0.0014404 | 0.0019583 |
| text1342 | -0.0304100 | -0.0245165 | 0.0194532 | -0.0120615 |
| text1343 | -0.0078934 | 0.0005730 | -0.0087319 | -0.0010338 |
| text1344 | -0.0175647 | -0.0056119 | 0.0069379 | 0.0053781 |
| text1345 | -0.0112575 | -0.0036575 | 0.0041023 | 0.0032193 |
| text1346 | -0.0083523 | -0.0049881 | -0.0003655 | 0.0023757 |
| text1347 | -0.0050436 | 0.0058937 | -0.0007687 | -0.0013930 |
| text1348 | -0.0199133 | -0.0241320 | 0.0191006 | -0.0060704 |
| text1349 | -0.0182670 | -0.0168302 | 0.0203179 | 0.0003837 |
| text1350 | -0.0253510 | -0.0202274 | 0.0228139 | 0.0030059 |
| text1351 | -0.0148204 | -0.0153495 | 0.0098272 | -0.0029220 |
| text1352 | -0.0106957 | -0.0087951 | -0.0023483 | -0.0002370 |
| text1353 | -0.0236027 | -0.0180741 | 0.0200338 | 0.0004266 |
| text1354 | -0.0123879 | -0.0023628 | 0.0025544 | 0.0048047 |
| text1355 | -0.0127741 | -0.0071121 | 0.0117839 | -0.0072104 |
| text1356 | -0.0157944 | -0.0188414 | 0.0238667 | -0.0031225 |
| text1357 | -0.0138435 | -0.0169481 | 0.0225248 | -0.0061239 |
| text1358 | -0.0374851 | -0.0345558 | 0.0147089 | -0.0104329 |
| text1359 | -0.0156332 | -0.0173687 | 0.0157866 | -0.0090199 |
| text1360 | -0.0238672 | -0.0309938 | 0.0184098 | -0.0136054 |
| text1361 | -0.0278060 | -0.0294771 | 0.0018664 | -0.0108769 |
| text1362 | -0.0108054 | -0.0167107 | 0.0105743 | -0.0070844 |
| text1363 | -0.0162729 | -0.0149582 | 0.0149370 | -0.0052254 |
| text1364 | -0.0169485 | -0.0082989 | 0.0037341 | 0.0063032 |
| text1365 | -0.0255821 | -0.0259972 | 0.0138923 | -0.0039673 |
| text1366 | -0.0139558 | -0.0141919 | 0.0110847 | 0.0002414 |
| text1367 | -0.0172765 | -0.0175659 | 0.0194921 | -0.0102851 |
| text1368 | -0.0108898 | -0.0188032 | 0.0233461 | -0.0085326 |
| text1369 | -0.0046710 | -0.0079571 | 0.0113909 | -0.0039857 |
| text1370 | -0.0270755 | -0.0427760 | 0.0632609 | -0.0284076 |
| text1371 | -0.0209777 | -0.0115480 | 0.0271077 | -0.0166764 |
| text1372 | -0.0083622 | -0.0129381 | 0.0155152 | -0.0115692 |
| text1373 | -0.0324820 | -0.0141705 | 0.0094045 | 0.0039952 |
| text1374 | -0.0179198 | -0.0124267 | 0.0135126 | -0.0032875 |
| text1375 | -0.0250306 | -0.0240338 | 0.0201811 | -0.0018895 |
| text1376 | -0.0294281 | -0.0239447 | 0.0234034 | 0.0012512 |
| text1377 | -0.0296595 | -0.0149764 | 0.0203755 | -0.0007243 |
| text1378 | -0.0011634 | -0.0007094 | 0.0001343 | -0.0004430 |
| text1379 | -0.0210614 | -0.0186492 | 0.0166222 | 0.0067423 |
| text1380 | -0.0192499 | -0.0114692 | 0.0027979 | 0.0026075 |
| text1381 | -0.0263224 | -0.0130773 | 0.0024896 | 0.0070165 |
| text1382 | -0.0282173 | -0.0255885 | 0.0290016 | -0.0041299 |
| text1383 | -0.0090403 | -0.0068360 | 0.0050652 | -0.0003802 |
| text1384 | -0.0160612 | -0.0135646 | 0.0094146 | 0.0046550 |
| text1385 | -0.0224823 | -0.0084191 | 0.0065262 | 0.0005661 |
| text1386 | -0.0252930 | -0.0191878 | 0.0156297 | -0.0000676 |
| text1387 | -0.0200779 | -0.0188203 | 0.0127996 | 0.0014429 |
| text1388 | -0.0209746 | -0.0158269 | 0.0078984 | 0.0069744 |
| text1389 | -0.0180291 | -0.0128333 | 0.0005124 | 0.0053052 |
| text1390 | -0.0121053 | -0.0066820 | 0.0057504 | 0.0020275 |
| text1391 | -0.0453184 | -0.0289202 | 0.0314715 | -0.0259590 |
| text1392 | -0.0253356 | -0.0356443 | 0.0355079 | -0.0215467 |
| text1393 | -0.0300863 | -0.0329645 | 0.0431182 | -0.0164955 |
| text1394 | -0.0346267 | -0.0497580 | 0.0621428 | -0.0218888 |
| text1395 | -0.0302935 | -0.0337548 | 0.0352059 | -0.0144010 |
| text1396 | -0.0280730 | -0.0201772 | 0.0277154 | 0.0019051 |
| text1397 | -0.0169138 | -0.0077611 | 0.0253478 | 0.0008448 |
| text1398 | -0.0239955 | -0.0158739 | 0.0318515 | 0.0005385 |
| text1399 | -0.0182385 | -0.0123515 | 0.0161897 | 0.0026435 |
| text1400 | -0.0149582 | -0.0085011 | 0.0089844 | 0.0003615 |
| text1401 | -0.0189402 | -0.0060880 | 0.0000486 | 0.0036233 |
| text1402 | -0.0134178 | -0.0045263 | 0.0080436 | -0.0004206 |
| text1403 | -0.0101801 | -0.0050896 | 0.0116766 | 0.0085836 |
| text1404 | -0.0313735 | -0.0105271 | 0.0207052 | 0.0106364 |
| text1405 | -0.0255722 | -0.0175560 | 0.0224601 | -0.0073196 |
| text1406 | -0.0245900 | -0.0090380 | 0.0043929 | 0.0093472 |
| text1407 | -0.0221980 | -0.0192938 | 0.0212631 | -0.0008437 |
| text1408 | -0.0251982 | -0.0240122 | 0.0420521 | -0.0261952 |
| text1409 | -0.0203148 | -0.0332502 | 0.0439579 | -0.0239312 |
| text1410 | -0.0159374 | -0.0197010 | 0.0208139 | -0.0130146 |
| text1411 | -0.0424525 | -0.0414119 | 0.0613792 | -0.0212083 |
| text1412 | -0.0186321 | -0.0237309 | 0.0287905 | -0.0012285 |
| text1413 | -0.0319115 | -0.0350255 | 0.0422151 | -0.0165795 |
| text1414 | -0.0179175 | -0.0153722 | 0.0222230 | -0.0108068 |
| text1415 | -0.0319043 | -0.0165855 | 0.0289279 | -0.0061888 |
| text1416 | -0.0405396 | 0.1034778 | -0.0073124 | -0.0888108 |
| text1417 | -0.0283061 | 0.0478516 | 0.0110753 | -0.0490471 |
| text1418 | -0.0429285 | 0.0910602 | 0.0104452 | -0.0723139 |
| text1419 | -0.0366631 | 0.0706994 | -0.0092534 | -0.0669935 |
| text1420 | -0.0100288 | 0.0308001 | -0.0001222 | -0.0126220 |
| text1421 | -0.0405396 | 0.1034778 | -0.0073124 | -0.0888108 |
| text1422 | -0.0283061 | 0.0478516 | 0.0110753 | -0.0490471 |
| text1423 | -0.0429285 | 0.0910602 | 0.0104452 | -0.0723139 |
| text1424 | -0.0366631 | 0.0706994 | -0.0092534 | -0.0669935 |
| text1425 | -0.0100288 | 0.0308001 | -0.0001222 | -0.0126220 |
| text1426 | -0.0138540 | -0.0064632 | 0.0021442 | -0.0032020 |
| text1427 | -0.0187671 | -0.0166677 | 0.0052083 | 0.0048837 |
| text1428 | -0.0188015 | -0.0161834 | 0.0062781 | -0.0054609 |
| text1429 | -0.0128209 | -0.0171851 | 0.0146064 | -0.0113797 |
| text1430 | -0.0263307 | -0.0275192 | 0.0145038 | -0.0019497 |
| text1431 | -0.0258309 | -0.0305303 | 0.0226093 | -0.0095027 |
| text1432 | -0.0229858 | -0.0188972 | 0.0228609 | -0.0017898 |
| text1433 | -0.0085748 | -0.0095649 | 0.0088374 | -0.0038393 |
| text1434 | -0.0335303 | -0.0160602 | 0.0127772 | 0.0113703 |
| text1435 | -0.0155297 | -0.0107261 | 0.0137074 | 0.0049370 |
| text1436 | -0.0229171 | -0.0300761 | 0.0207464 | -0.0097378 |
| text1437 | -0.0241654 | -0.0261816 | 0.0153819 | -0.0017239 |
| text1438 | -0.0489894 | -0.0481185 | 0.0289188 | -0.0169174 |
| text1439 | -0.0250466 | -0.0154327 | 0.0117028 | -0.0065938 |
| text1440 | -0.0520046 | -0.0583742 | 0.0662490 | -0.0504839 |
| text1441 | -0.0334258 | -0.0393509 | 0.0371494 | -0.0235328 |
| text1442 | -0.0327471 | -0.0393492 | 0.0406206 | -0.0202905 |
| text1443 | -0.0277549 | -0.0257793 | 0.0298059 | -0.0185247 |
| text1444 | -0.0269057 | -0.0249537 | 0.0342853 | -0.0057383 |
| text1445 | -0.0398505 | -0.0400880 | 0.0362299 | -0.0203891 |
| text1446 | -0.0322838 | -0.0288647 | 0.0217970 | -0.0087234 |
| text1447 | -0.0168690 | -0.0186688 | 0.0209756 | -0.0132781 |
| text1448 | -0.0172030 | -0.0046078 | 0.0008116 | 0.0049950 |
| text1449 | -0.0207689 | -0.0115445 | 0.0116389 | 0.0012889 |
| text1450 | -0.0146050 | -0.0106660 | 0.0073105 | 0.0028472 |
| text1451 | -0.0090539 | -0.0066361 | 0.0060523 | -0.0021878 |
| text1452 | -0.0054197 | 0.0031217 | 0.0024780 | -0.0019646 |
| text1453 | -0.0225591 | -0.0431354 | 0.0494379 | -0.0250713 |
| text1454 | -0.0398156 | -0.0359628 | 0.0492129 | -0.0060549 |
| text1455 | -0.0111392 | -0.0185834 | 0.0233839 | -0.0095273 |
| text1456 | -0.0283824 | -0.0605737 | 0.0721800 | -0.0341658 |
| text1457 | -0.0048972 | -0.0133059 | 0.0162561 | -0.0070454 |
| text1458 | -0.0280098 | -0.0185220 | 0.0092738 | 0.0040364 |
| text1459 | -0.0258793 | -0.0131391 | 0.0180048 | 0.0078444 |
| text1460 | -0.0222697 | -0.0190342 | 0.0231783 | -0.0136937 |
| text1461 | -0.0311645 | -0.0079515 | 0.0039554 | -0.0004442 |
| text1462 | -0.0254623 | -0.0215802 | 0.0097058 | -0.0054261 |
| text1463 | -0.0065563 | -0.0029828 | -0.0001848 | 0.0011721 |
| text1464 | -0.0122730 | -0.0092948 | 0.0032985 | -0.0007902 |
| text1465 | -0.0398933 | -0.0240134 | 0.0226183 | 0.0076573 |
| text1466 | -0.0182342 | -0.0106780 | 0.0117423 | -0.0024338 |
| text1467 | -0.0296004 | -0.0191827 | 0.0214830 | -0.0001373 |
| text1468 | -0.0717713 | -0.0122703 | 0.0406497 | -0.0389009 |
| text1469 | -0.0048696 | -0.0011797 | 0.0014782 | -0.0006533 |
| text1470 | -0.0268300 | -0.0175015 | 0.0134075 | -0.0086325 |
| text1471 | -0.0117531 | -0.0020362 | 0.0015143 | 0.0032228 |
The topic-strength table below reports the singular value (topic strength) associated with each LSA dimension.
CN <- c("dimension1", "dimension2", "dimension3", "dimension4")
Topic_Strength <- data.frame(CN, TED.lsa$sk)
kable(Topic_Strength,
col.names= c("Dimension","Topic strength"),
caption = "Topic strength(LSA on TF)")%>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "200px")
| Dimension | Topic strength |
|---|---|
| dimension1 | 183.84893 |
| dimension2 | 85.97943 |
| dimension3 | 81.27583 |
| dimension4 | 73.82645 |
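Since the singular values indicate how much of the document-term structure each dimension captures, each can also be expressed as a share of the retained total. A minimal sketch, assuming `TED.lsa$sk` holds the four singular values reported above:

```r
# Express each singular value as a share of the retained total
# (TED.lsa$sk holds the four singular values reported above)
round(TED.lsa$sk / sum(TED.lsa$sk), 2)
```

With the values in the table, dimension 1 alone accounts for roughly 43% of the retained singular-value mass (183.85 / 424.93).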
The terms-topic similarity table below shows how strongly each term is associated with each topic. For example, the term “artificial” is most relevant to dimension 2.
kable(head(TED.lsa$features,10),
col.names = c("dimension1","dimension2","dimension3","dimension4"),
caption = "Terms-topic sim.(LSA on TF)") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| | dimension1 | dimension2 | dimension3 | dimension4 |
|---|---|---|---|---|
| today | -0.0569966 | 0.0126785 | -0.0485503 | -0.0155983 |
| artificial | -0.0303165 | 0.0683780 | -0.0026655 | -0.0039735 |
| intelligence | -0.0516758 | 0.1320332 | 0.0036350 | -0.0102737 |
| help | -0.0253760 | 0.0020966 | 0.0032362 | -0.0121982 |
| doctor | -0.0138459 | 0.0036630 | 0.0071495 | -0.0041132 |
| diagnose | -0.0058367 | 0.0095565 | 0.0023165 | -0.0030806 |
| patient | -0.0130458 | 0.0135236 | 0.0042128 | -0.0092575 |
| pilot | -0.0023414 | 0.0010596 | -0.0031627 | 0.0018573 |
| fly | -0.0104628 | 0.0055722 | -0.0042352 | 0.0232152 |
| commercial | -0.0024783 | 0.0036116 | -0.0043273 | -0.0011694 |
The first LSA dimension is often correlated with document length and term frequency. This can be visualized with a scatter plot of document length against the first dimension of the latent semantic space.
doc.freq <- ntoken(TED.tk) # row-sum of the DTM.
data.frame(doc.freq,
dim1 = TED.lsa$docs[, 1]) %>%
ggplot(aes(doc.freq, dim1)) +
geom_point() +
geom_smooth(method="lm",
formula = 'y ~ x') +
labs(
title="The relationship between the number of tokens in documents and the values in LSA dimension 1",
x="Number of tokens",
y="LSA dim. 1"
)
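To complement the plot, the strength of this association can be quantified directly. A minimal check, reusing the objects defined above:

```r
# Pearson correlation between document length (token count)
# and the document scores on LSA dimension 1
cor(doc.freq, TED.lsa$docs[, 1])
```

A correlation close to -1 or 1 would confirm that dimension 1 mainly reflects document length rather than topical content.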
We then examine the top words in dimensions 2, 3, and 4. For each dimension, we look at the five terms with the largest positive values and the five terms with the most negative values.
According to the table below, dimension 2 is positively associated with words like “ai”, “human”, “robot”, “machine”, and “datum”, and negatively associated with “feel”, “climate”, “life”, “love”, and “people”.
n.terms <- 5
## For Dimension 2
w.order <- sort(TED.lsa$features[, 2],decreasing = TRUE)
w.top.d2 <- c(w.order[1:n.terms],rev(rev(w.order)[1:n.terms]))
## For Dimension 3
w.order <- sort(TED.lsa$features[, 3], decreasing = TRUE)
w.top.d3 <- c(w.order[1:n.terms], rev(rev(w.order)[1:n.terms]))
## For Dimension 4
w.order <- sort(TED.lsa$features[,4], decreasing = TRUE)
w.top.d4 <- c(w.order[1:n.terms], rev(rev(w.order)[1:n.terms]))
kable(w.top.d2,
col.names = "value",
caption = "Dimension 2(LSA on TF)")%>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "200px")
| | value |
|---|---|
| ai | 0.5115474 |
| human | 0.3948890 |
| robot | 0.1953350 |
| machine | 0.1781869 |
| datum | 0.1514245 |
| feel | -0.1080379 |
| climate | -0.1133600 |
| life | -0.1259553 |
| love | -0.2119132 |
| people | -0.2781561 |
The table below shows that dimension 3 is positively associated with words like “people”, “love”, “robot”, “feel”, and “life”, and negatively associated with “forest”, “year”, “energy”, “carbon”, and “climate”.
kable(w.top.d3,
col.names = "value",
caption = "Dimension 3(LSA on TF)")%>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "200px")
| | value |
|---|---|
| people | 0.2709439 |
| love | 0.2628304 |
| robot | 0.1889753 |
| feel | 0.1307637 |
| life | 0.1043617 |
| forest | -0.1520730 |
| year | -0.1816735 |
| energy | -0.1888159 |
| carbon | -0.1949385 |
| climate | -0.2867198 |
The table below shows that dimension 4 is positively associated with words like “robot”, “thing”, “rule”, “move”, and “start”, and negatively associated with “datum”, “human”, “love”, “people”, and “ai”.
kable(w.top.d4,
col.names = "value",
caption = "Dimension 4(LSA on TF)")%>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "200px")
| | value |
|---|---|
| robot | 0.7714658 |
| thing | 0.1265223 |
| rule | 0.1009858 |
| move | 0.0932420 |
| start | 0.0730760 |
| datum | -0.0920711 |
| human | -0.1281984 |
| love | -0.1326988 |
| people | -0.1940262 |
| ai | -0.3584263 |
To check the relationship between the LSA dimensions and the category of each text, we combine the LSA results with the document categories and plot every text in the two plots below.
TED.lsa.source <- TED_full %>%
select(2) %>% cbind(as.data.frame(TED.lsa$docs))
LSA_p1 <- ggplot(data=TED.lsa.source,mapping = aes(
x=V2,
y=V3,
color=cate))+
geom_point()+
labs(x = "dimension2",
y = "dimension3",
title = "Distribution of texts in different category",
subtitle = "LSA(TF) dimension 2 and 3")+
scale_colour_discrete(
name="Category",
breaks=c("1","2","3"),
labels=c("AI","Climate change","Relationships")
)+
theme(plot.title = element_text(size = 12))
LSA_P2 <- ggplot(data=TED.lsa.source,mapping = aes(
x=V3,
y=V4,
color=cate))+
geom_point()+
labs(x = "dimension3",
y = "dimension4",
title = "Distribution of texts in different category",
subtitle = "LSA(TF):dimension 3 and 4")+
scale_colour_discrete(
name="Category",
breaks=c("1","2","3"),
labels=c("AI","Climate change","Relationships")
)+
theme(plot.title = element_text(size = 12))
(LSA_p1+LSA_P2)+
plot_layout(guides = "collect") & theme(legend.position = 'bottom')
Left plot (x-axis: dimension 2, y-axis: dimension 3): most texts in the “Climate change” category are negatively associated with dimension 3, most texts in the “Relationships” category are positively associated with dimension 3, and most texts in the “AI” category are positively associated with dimension 2.
Right plot (x-axis: dimension 3, y-axis: dimension 4): most texts in the “AI” category are positively associated with dimension 4, while the “Climate change” and “Relationships” categories show no clear association with dimension 4.
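These visual impressions can be backed up numerically by averaging the dimension scores within each category. A minimal sketch, assuming dplyr is loaded and reusing `TED.lsa.source` from above:

```r
# Mean score per category on dimensions 2-4:
# a positive (negative) mean indicates the category sits mostly
# on the positive (negative) side of that dimension
TED.lsa.source %>%
  group_by(cate) %>%
  summarise(across(V2:V4, mean))
```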
We then repeat the LSA with the TF-IDF matrix as the DTM, to check whether the weighted frequencies make the LSA results easier to interpret.
TED.lsa2 <- textmodel_lsa(TED.tfidf, nd = 4)
kable(TED.lsa2$docs,
col.names = c("dimension1","dimension2","dimension3","dimension4"),
caption = "Doc-topic sim.(LSA on TF-IDF)") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| | dimension1 | dimension2 | dimension3 | dimension4 |
|---|---|---|---|---|
| text1 | -0.0422496 | -0.0329279 | -0.0320857 | -0.0394412 |
| text2 | -0.0275697 | -0.0224974 | -0.0249187 | -0.0308737 |
| text3 | -0.0352744 | -0.0394390 | -0.0573503 | -0.0317769 |
| text4 | -0.0387940 | -0.0440947 | -0.0690456 | -0.0479835 |
| text5 | -0.0425726 | -0.0337124 | -0.0456903 | -0.0725318 |
| text6 | -0.0213413 | -0.0274721 | -0.0331742 | -0.0599822 |
| text7 | -0.0372246 | -0.0300740 | -0.0267797 | -0.0323277 |
| text8 | -0.0367941 | -0.0212970 | -0.0162914 | -0.0250740 |
| text9 | -0.0363915 | -0.0263333 | -0.0233207 | -0.0359802 |
| text10 | -0.0308465 | -0.0253658 | -0.0140726 | -0.0261814 |
| text11 | -0.0402360 | -0.0229583 | -0.0268662 | -0.0377900 |
| text12 | -0.0375956 | -0.0272690 | -0.0224744 | -0.0247567 |
| text13 | -0.0245463 | -0.0025653 | -0.0091655 | -0.0169218 |
| text14 | -0.0252603 | -0.0124116 | -0.0122370 | -0.0152749 |
| text15 | -0.0001732 | -0.0000771 | 0.0001284 | 0.0000621 |
| text16 | -0.0308109 | -0.0184768 | -0.0131101 | -0.0031044 |
| text17 | -0.0214623 | -0.0090715 | 0.0034872 | -0.0065740 |
| text18 | -0.0271418 | -0.0101689 | 0.0100212 | 0.0053525 |
| text19 | -0.0211285 | -0.0092654 | 0.0079649 | -0.0076075 |
| text20 | -0.0156512 | -0.0071335 | 0.0095319 | -0.0009809 |
| text21 | -0.0174472 | -0.0060270 | 0.0004573 | -0.0039779 |
| text22 | -0.0246188 | -0.0163992 | -0.0185785 | -0.0164401 |
| text23 | -0.0247240 | -0.0173729 | -0.0138138 | -0.0114032 |
| text24 | -0.0288989 | -0.0253290 | -0.0306215 | 0.0164976 |
| text25 | -0.0417036 | -0.0440674 | -0.0693023 | 0.0400973 |
| text26 | -0.0326873 | -0.0129338 | -0.0241456 | -0.0097452 |
| text27 | -0.0025679 | -0.0001549 | 0.0000837 | -0.0003927 |
| text28 | -0.0220737 | -0.0109435 | -0.0171714 | -0.0223362 |
| text29 | -0.0323942 | -0.0324238 | -0.0258192 | -0.0386713 |
| text30 | -0.0328217 | 0.0069878 | -0.0199073 | -0.0178160 |
| text31 | -0.0254942 | -0.0209568 | -0.0458023 | -0.0111945 |
| text32 | -0.0217534 | -0.0179579 | -0.0255266 | -0.0251244 |
| text33 | -0.0077153 | -0.0024128 | -0.0054185 | -0.0120708 |
| text34 | -0.0285262 | -0.0315279 | -0.0226315 | -0.0496563 |
| text35 | -0.0232184 | -0.0166659 | -0.0237707 | -0.0301823 |
| text36 | -0.0262233 | -0.0230341 | -0.0085872 | -0.0250824 |
| text37 | -0.0304031 | -0.0265574 | -0.0262080 | -0.0561061 |
| text38 | -0.0205409 | -0.0133237 | -0.0154653 | -0.0080555 |
| text39 | -0.0152019 | -0.0066998 | -0.0012370 | -0.0046716 |
| text40 | -0.0244961 | -0.0150788 | -0.0092616 | -0.0227058 |
| text41 | -0.0250216 | -0.0178983 | -0.0075760 | -0.0134588 |
| text42 | -0.0213407 | -0.0161356 | -0.0025263 | -0.0153482 |
| text43 | -0.0154167 | -0.0081825 | 0.0018929 | -0.0034267 |
| text44 | -0.0315698 | -0.0094676 | 0.0004606 | -0.0182212 |
| text45 | -0.0234959 | -0.0055520 | -0.0118687 | -0.0179219 |
| text46 | -0.0038574 | -0.0021085 | -0.0032150 | -0.0045033 |
| text47 | -0.0476029 | -0.0299750 | -0.0258681 | -0.0646642 |
| text48 | -0.0457408 | 0.0006985 | -0.0353752 | -0.0467143 |
| text49 | -0.0420311 | -0.0328334 | 0.0375300 | -0.0268927 |
| text50 | -0.0230294 | -0.0263774 | -0.0196203 | -0.0488351 |
| text51 | -0.0165408 | 0.0013677 | 0.0100517 | -0.0021014 |
| text52 | -0.0192817 | -0.0033588 | 0.0171124 | 0.0007307 |
| text53 | -0.0176920 | -0.0025925 | 0.0071863 | -0.0002465 |
| text54 | -0.0448211 | -0.0118129 | -0.0146718 | -0.0466950 |
| text55 | -0.0185272 | -0.0025597 | 0.0002995 | -0.0077944 |
| text56 | -0.0275046 | -0.0021669 | 0.0001981 | -0.0069063 |
| text57 | -0.0002411 | -0.0000942 | 0.0001310 | -0.0000092 |
| text58 | -0.0382946 | -0.0163995 | 0.0007255 | -0.0226702 |
| text59 | -0.0165774 | -0.0028263 | 0.0003179 | -0.0076998 |
| text60 | -0.0207104 | 0.0019542 | -0.0035941 | -0.0007140 |
| text61 | -0.0345667 | 0.0412849 | -0.0187940 | -0.0068525 |
| text62 | -0.0304427 | 0.0326191 | -0.0124245 | 0.0022325 |
| text63 | -0.0359581 | 0.0165657 | -0.0183056 | 0.0000682 |
| text64 | -0.0349514 | -0.0233243 | -0.0301906 | -0.0264169 |
| text65 | -0.0351091 | -0.0107101 | -0.0258786 | -0.0098813 |
| text66 | -0.0285421 | -0.0129534 | -0.0129755 | -0.0117214 |
| text67 | -0.0430634 | -0.0538184 | -0.0940910 | 0.0790135 |
| text68 | -0.0333620 | -0.0360427 | -0.0390475 | 0.0500353 |
| text69 | -0.0358999 | -0.0125041 | -0.0450806 | 0.0171264 |
| text70 | -0.0265458 | -0.0180285 | -0.0125030 | -0.0096695 |
| text71 | -0.0330347 | -0.0191165 | -0.0152974 | -0.0107520 |
| text72 | -0.0259106 | -0.0223942 | -0.0080558 | -0.0175451 |
| text73 | -0.0167782 | -0.0020201 | -0.0150208 | -0.0001281 |
| text74 | -0.0274838 | 0.0321909 | -0.0163581 | 0.0081847 |
| text75 | -0.0376141 | -0.0002969 | -0.0192198 | 0.0011115 |
| text76 | -0.0253981 | 0.0020801 | -0.0105971 | -0.0004877 |
| text77 | -0.0328547 | -0.0082745 | -0.0107545 | 0.0038368 |
| text78 | -0.0235799 | 0.0076515 | -0.0092186 | 0.0043661 |
| text79 | -0.0426763 | -0.0026489 | -0.0426432 | 0.0082036 |
| text80 | -0.0340554 | 0.0012138 | -0.0184029 | -0.0121575 |
| text81 | -0.0288114 | 0.0061969 | -0.0043946 | -0.0067574 |
| text82 | -0.0336190 | -0.0060698 | -0.0074208 | -0.0104399 |
| text83 | -0.0107581 | 0.0024605 | -0.0029071 | -0.0008949 |
| text84 | -0.0359517 | -0.0290342 | -0.0052703 | -0.0013052 |
| text85 | -0.0268422 | -0.0230854 | -0.0014778 | -0.0046206 |
| text86 | -0.0379264 | -0.0137758 | -0.0045471 | -0.0040840 |
| text87 | -0.0264541 | -0.0164133 | -0.0107138 | -0.0100730 |
| text88 | -0.0018708 | -0.0026435 | 0.0007437 | 0.0020845 |
| text89 | -0.0345333 | -0.0371732 | -0.0134869 | -0.0137677 |
| text90 | -0.0210573 | -0.0139768 | 0.0080108 | 0.0058116 |
| text91 | -0.0245812 | -0.0177719 | -0.0045580 | -0.0059027 |
| text92 | -0.0176004 | -0.0096606 | 0.0106012 | 0.0062227 |
| text93 | -0.0100241 | -0.0094101 | 0.0066199 | -0.0008196 |
| text94 | -0.0153538 | -0.0094816 | 0.0117699 | 0.0045340 |
| text95 | -0.0198159 | -0.0096780 | 0.0115536 | 0.0063478 |
| text96 | -0.0169387 | -0.0166280 | 0.0020732 | -0.0084961 |
| text97 | -0.0278591 | -0.0283500 | -0.0127402 | -0.0236465 |
| text98 | -0.0308987 | -0.0272103 | -0.0102840 | -0.0246003 |
| text99 | -0.0305988 | -0.0322530 | -0.0182695 | -0.0315139 |
| text100 | -0.0084314 | -0.0068559 | -0.0072867 | -0.0061409 |
| text101 | -0.0155980 | -0.0154976 | -0.0196662 | 0.0254398 |
| text102 | -0.0302287 | -0.0355006 | -0.0387779 | 0.0336160 |
| text103 | -0.0214384 | -0.0286247 | -0.0437458 | 0.0628185 |
| text104 | -0.0321533 | -0.0165488 | -0.0170045 | -0.0089346 |
| text105 | -0.0553076 | -0.0457262 | -0.0707497 | -0.1210271 |
| text106 | -0.0266575 | -0.0067721 | -0.0133371 | -0.0229789 |
| text107 | -0.0483565 | -0.0360432 | -0.0542502 | -0.0968743 |
| text108 | -0.0441772 | -0.0179332 | -0.0185173 | -0.0451624 |
| text109 | -0.0030654 | -0.0013810 | 0.0009570 | -0.0010292 |
| text110 | -0.0244695 | -0.0019428 | 0.0085860 | -0.0066194 |
| text111 | -0.0378921 | -0.0145546 | -0.0277322 | -0.0047162 |
| text112 | -0.0263278 | -0.0144632 | -0.0124616 | -0.0023016 |
| text113 | -0.0230602 | -0.0120708 | -0.0104209 | -0.0205510 |
| text114 | -0.0217847 | -0.0116670 | -0.0009090 | 0.0055827 |
| text115 | -0.0306673 | -0.0088612 | -0.0106969 | -0.0221998 |
| text116 | -0.0197712 | -0.0072945 | -0.0067561 | -0.0202252 |
| text117 | -0.0184897 | -0.0068088 | 0.0121309 | -0.0003057 |
| text118 | -0.0209115 | -0.0091848 | -0.0038574 | -0.0082758 |
| text119 | -0.0216810 | -0.0118997 | 0.0063949 | 0.0026014 |
| text120 | -0.0141134 | -0.0106713 | 0.0047271 | -0.0025073 |
| text121 | -0.0528632 | -0.0329710 | -0.0069018 | -0.0388113 |
| text122 | -0.0184443 | -0.0122901 | -0.0023863 | -0.0188012 |
| text123 | -0.0250909 | -0.0209522 | -0.0137996 | -0.0245978 |
| text124 | -0.0553400 | -0.0271457 | -0.0204519 | -0.0412997 |
| text125 | -0.0411523 | -0.0175170 | -0.0108046 | -0.0263564 |
| text126 | -0.0079568 | -0.0066342 | -0.0019126 | -0.0065682 |
| text127 | -0.0355137 | 0.0015610 | -0.0127652 | -0.0087945 |
| text128 | -0.0155297 | 0.0021304 | -0.0057396 | 0.0045520 |
| text129 | -0.0496970 | -0.0278353 | -0.0357213 | -0.0091834 |
| text130 | -0.0358097 | -0.0125989 | -0.0194836 | -0.0118394 |
| text131 | -0.0433647 | -0.0168190 | -0.0234387 | -0.0133293 |
| text132 | -0.0226695 | -0.0035539 | 0.0046253 | -0.0057726 |
| text133 | -0.0280413 | -0.0063374 | -0.0084255 | -0.0143910 |
| text134 | -0.0460359 | -0.0279474 | -0.0199764 | -0.0365764 |
| text135 | -0.0295928 | 0.0027573 | -0.0011455 | -0.0113531 |
| text136 | -0.0320757 | -0.0011826 | -0.0037077 | -0.0106186 |
| text137 | -0.0411163 | -0.0067814 | -0.0324854 | 0.0039856 |
| text138 | -0.0060010 | -0.0020034 | -0.0020123 | -0.0032373 |
| text139 | -0.0242676 | -0.0096178 | -0.0120453 | -0.0143593 |
| text140 | -0.0339413 | -0.0187144 | -0.0169736 | -0.0272028 |
| text141 | -0.0169068 | -0.0041471 | -0.0084726 | -0.0196592 |
| text142 | -0.0273048 | 0.0437190 | -0.0115942 | 0.0001823 |
| text143 | -0.0243373 | 0.0050499 | 0.0016880 | 0.0049187 |
| text144 | -0.0225731 | -0.0109336 | 0.0030716 | -0.0033375 |
| text145 | -0.0258518 | -0.0135409 | 0.0040152 | -0.0035109 |
| text146 | -0.0214490 | -0.0003692 | -0.0063649 | -0.0066919 |
| text147 | -0.0059924 | -0.0001775 | -0.0007352 | -0.0029434 |
| text148 | -0.0212526 | -0.0068283 | -0.0034633 | -0.0128749 |
| text149 | -0.0252826 | -0.0185539 | -0.0201771 | -0.0121654 |
| text150 | -0.0189356 | -0.0079871 | -0.0074768 | -0.0140191 |
| text151 | -0.0342176 | -0.0159925 | -0.0180575 | -0.0295914 |
| text152 | -0.0218288 | -0.0077337 | -0.0063248 | -0.0118541 |
| text153 | -0.0272256 | -0.0141420 | -0.0154840 | -0.0293550 |
| text154 | -0.0233409 | 0.0108965 | -0.0073353 | -0.0019573 |
| text155 | -0.0239026 | -0.0299776 | -0.0465232 | 0.0686051 |
| text156 | -0.0323097 | -0.0588135 | -0.0828277 | 0.1296122 |
| text157 | -0.0369880 | -0.0479545 | -0.0769620 | 0.1105174 |
| text158 | -0.0005596 | -0.0008190 | -0.0001413 | 0.0007490 |
| text159 | -0.0135582 | 0.0049557 | 0.0043559 | 0.0001592 |
| text160 | -0.0139801 | 0.0100892 | -0.0004842 | 0.0019604 |
| text161 | -0.0186958 | -0.0095949 | -0.0114481 | -0.0201409 |
| text162 | -0.0220879 | -0.0243950 | -0.0129959 | -0.0288409 |
| text163 | -0.0245168 | -0.0209255 | -0.0228963 | -0.0457514 |
| text164 | -0.0186017 | -0.0111364 | 0.0036876 | -0.0044178 |
| text165 | -0.0123882 | -0.0040405 | 0.0009014 | -0.0041077 |
| text166 | -0.0125051 | -0.0035903 | 0.0050714 | -0.0003199 |
| text167 | -0.0141532 | -0.0071592 | 0.0153364 | 0.0021595 |
| text168 | -0.0021862 | -0.0023575 | 0.0048938 | 0.0008592 |
| text169 | -0.0581031 | -0.0243685 | -0.0728200 | -0.1291666 |
| text170 | -0.0483682 | -0.0205083 | -0.0760620 | -0.1333498 |
| text171 | -0.0526862 | -0.0375945 | -0.0691036 | -0.1297437 |
| text172 | -0.0214071 | -0.0148342 | -0.0302543 | -0.0512241 |
| text173 | -0.0286016 | -0.0393821 | -0.0581057 | 0.0898401 |
| text174 | -0.0421514 | -0.0460745 | -0.0612038 | 0.0978455 |
| text175 | -0.0283336 | -0.0295660 | -0.0113124 | 0.0352703 |
| text176 | -0.0328253 | -0.0501779 | -0.0776684 | 0.1012764 |
| text177 | -0.0277876 | -0.0440216 | -0.0688854 | 0.0773207 |
[Raw output omitted: per-transcript scores for text178 through text751 on the four extracted components, with values ranging from roughly -0.14 to 0.22. The full table of document scores is available in the project's supplementary data.]
| text752 | -0.0358348 | 0.0404270 | -0.0069765 | 0.0109885 |
| text753 | -0.0148963 | 0.0167376 | -0.0007375 | 0.0057612 |
| text754 | -0.0307180 | 0.0360638 | 0.0055145 | -0.0020696 |
| text755 | -0.0384901 | 0.0450181 | -0.0007546 | -0.0063306 |
| text756 | -0.0201793 | 0.0125803 | 0.0005105 | -0.0026977 |
| text757 | -0.0247000 | 0.0219591 | 0.0042964 | -0.0018364 |
| text758 | -0.0260638 | 0.0241676 | 0.0048360 | 0.0004432 |
| text759 | -0.0340141 | 0.0290120 | -0.0078927 | -0.0131849 |
| text760 | -0.0306821 | 0.0360198 | -0.0002326 | -0.0010770 |
| text761 | -0.0410060 | 0.0364852 | -0.0135248 | -0.0222282 |
| text762 | -0.0318848 | 0.0345422 | -0.0001567 | 0.0002855 |
| text763 | -0.0108322 | 0.0100202 | 0.0000118 | 0.0004832 |
| text764 | -0.0231008 | 0.0012211 | 0.0146265 | 0.0045819 |
| text765 | -0.0272630 | 0.0069008 | 0.0198868 | 0.0065063 |
| text766 | -0.0205631 | -0.0071754 | 0.0089083 | -0.0041171 |
| text767 | -0.0270629 | 0.0098756 | 0.0107202 | -0.0030073 |
| text768 | -0.0311228 | 0.0057800 | 0.0150993 | -0.0020896 |
| text769 | -0.0336961 | 0.0150366 | 0.0095644 | 0.0009296 |
| text770 | -0.0064343 | -0.0020907 | 0.0063269 | 0.0002337 |
| text771 | -0.0289891 | 0.0602131 | -0.0024564 | 0.0158551 |
| text772 | -0.0266258 | 0.0366021 | 0.0018702 | 0.0098373 |
| text773 | -0.0352914 | 0.0628796 | -0.0023353 | 0.0322223 |
| text774 | -0.0238592 | 0.0685545 | -0.0029754 | 0.0382096 |
| text775 | -0.0291969 | 0.0644272 | -0.0029621 | 0.0460548 |
| text776 | -0.0233778 | 0.0309343 | -0.0016700 | 0.0095488 |
| text777 | -0.0228718 | 0.0142099 | 0.0136940 | 0.0112264 |
| text778 | -0.0356685 | 0.0058738 | 0.0170273 | 0.0008987 |
| text779 | -0.0144526 | 0.0007495 | 0.0121131 | -0.0006222 |
| text780 | -0.0213932 | 0.0022029 | 0.0079014 | -0.0005303 |
| text781 | -0.0022675 | -0.0001407 | 0.0018768 | 0.0000027 |
| text782 | -0.0323464 | 0.0407789 | 0.0029929 | 0.0054801 |
| text783 | -0.0252892 | 0.0362123 | 0.0025821 | 0.0114609 |
| text784 | -0.0169707 | 0.0203712 | 0.0010805 | 0.0058664 |
| text785 | -0.0369689 | 0.0437560 | 0.0112681 | 0.0121757 |
| text786 | -0.0245449 | 0.0345591 | 0.0050451 | 0.0056654 |
| text787 | -0.0205265 | 0.0247393 | -0.0024719 | -0.0030624 |
| text788 | -0.0237019 | 0.0302302 | -0.0005076 | -0.0042072 |
| text789 | -0.0186857 | 0.0203644 | 0.0001090 | -0.0025428 |
| text790 | -0.0240670 | 0.0150798 | 0.0075725 | 0.0007033 |
| text791 | -0.0268843 | -0.0025208 | 0.0098521 | -0.0030419 |
| text792 | -0.0167968 | 0.0015056 | 0.0068416 | 0.0008872 |
| text793 | -0.0190150 | 0.0085609 | 0.0078822 | 0.0086992 |
| text794 | -0.0201708 | 0.0193689 | -0.0002546 | 0.0085273 |
| text795 | -0.0307276 | 0.0650878 | -0.0020586 | 0.0142869 |
| text796 | -0.0145188 | 0.0244001 | 0.0023468 | 0.0117541 |
| text797 | -0.0295926 | 0.0598833 | -0.0003212 | 0.0205994 |
| text798 | -0.0223948 | 0.0384099 | -0.0015949 | 0.0153246 |
| text799 | -0.0204531 | 0.0264376 | 0.0038712 | 0.0064874 |
| text800 | -0.0109601 | 0.0017521 | 0.0037075 | 0.0040081 |
| text801 | -0.0152867 | 0.0110898 | 0.0019766 | 0.0067568 |
| text802 | -0.0130945 | 0.0048574 | 0.0000126 | 0.0039167 |
| text803 | -0.0303301 | 0.0362430 | 0.0055545 | 0.0119072 |
| text804 | -0.0346882 | 0.0377897 | 0.0023708 | 0.0028386 |
| text805 | -0.0347460 | 0.0336957 | 0.0042148 | 0.0065983 |
| text806 | -0.0318949 | 0.0420702 | -0.0194040 | -0.0063355 |
| text807 | -0.0119577 | 0.0104761 | -0.0050665 | -0.0044619 |
| text808 | -0.0213345 | 0.0414209 | -0.0017419 | 0.0194532 |
| text809 | -0.0355143 | 0.0856837 | -0.0028925 | 0.0260430 |
| text810 | -0.0318531 | 0.0572162 | -0.0006690 | 0.0163769 |
| text811 | -0.0286625 | 0.0520201 | -0.0058168 | 0.0136456 |
| text812 | -0.0332555 | 0.0115350 | -0.0129684 | -0.0288393 |
| text813 | -0.0460313 | 0.0130825 | -0.0187616 | -0.0504652 |
| text814 | -0.0380797 | 0.0328016 | -0.0179468 | -0.0190681 |
| text815 | -0.0163871 | 0.0067259 | -0.0085099 | -0.0116168 |
| text816 | -0.0209306 | 0.0063505 | 0.0002919 | 0.0039765 |
| text817 | -0.0153556 | 0.0043953 | 0.0043424 | 0.0061796 |
| text818 | -0.0174380 | 0.0297192 | 0.0028518 | 0.0112530 |
| text819 | -0.0099384 | -0.0020902 | 0.0061286 | 0.0011073 |
| text820 | -0.0130774 | 0.0060792 | 0.0044857 | 0.0049566 |
| text821 | -0.0250987 | 0.0104957 | 0.0003294 | 0.0066760 |
| text822 | -0.0218886 | 0.0206634 | -0.0050679 | 0.0073793 |
| text823 | -0.0120309 | 0.0061304 | 0.0022196 | 0.0028851 |
| text824 | -0.0094658 | 0.0008523 | 0.0009146 | 0.0019154 |
| text825 | -0.0015952 | 0.0049794 | -0.0002685 | 0.0010653 |
| text826 | -0.0448826 | 0.0921015 | -0.0154622 | 0.0235030 |
| text827 | -0.0451989 | 0.0832044 | -0.0064745 | 0.0293726 |
| text828 | -0.0443781 | 0.0381317 | -0.0191933 | 0.0036721 |
| text829 | -0.0380980 | 0.0442711 | -0.0075811 | -0.0011929 |
| text830 | -0.0147009 | 0.0097894 | -0.0002952 | -0.0032324 |
| text831 | -0.0351014 | 0.0147570 | 0.0104433 | -0.0026567 |
| text832 | -0.0443763 | 0.0127821 | 0.0223146 | 0.0079202 |
| text833 | -0.0400392 | 0.0345471 | -0.0114330 | -0.0134345 |
| text834 | -0.0448987 | 0.0579230 | -0.0127463 | -0.0154831 |
| text835 | -0.0464682 | 0.0367860 | -0.0044169 | -0.0090635 |
| text836 | -0.0154721 | 0.0067464 | 0.0026355 | -0.0022947 |
| text837 | -0.0112470 | 0.0142600 | 0.0001295 | 0.0038449 |
| text838 | -0.0282485 | 0.0267754 | -0.0074074 | 0.0027968 |
| text839 | -0.0222009 | 0.0219108 | -0.0090204 | 0.0007667 |
| text840 | -0.0186223 | 0.0167244 | -0.0081306 | -0.0045538 |
| text841 | -0.0259896 | 0.0394396 | -0.0083757 | -0.0045524 |
| text842 | -0.0213902 | 0.0247613 | -0.0047969 | -0.0036327 |
| text843 | -0.0282086 | 0.0374140 | -0.0057775 | -0.0057913 |
| text844 | -0.0431235 | 0.0405747 | 0.0173430 | 0.0026627 |
| text845 | -0.0482202 | 0.0464304 | 0.0149178 | -0.0061781 |
| text846 | -0.0069274 | 0.0040227 | 0.0005559 | -0.0015400 |
| text847 | -0.0329974 | 0.0251860 | -0.0007935 | -0.0109707 |
| text848 | -0.0402267 | 0.0616039 | -0.0043738 | -0.0061732 |
| text849 | -0.0465591 | 0.0892869 | -0.0085016 | -0.0043306 |
| text850 | -0.0365779 | 0.0586351 | -0.0060465 | -0.0070547 |
| text851 | -0.0002838 | -0.0000618 | 0.0002489 | 0.0000980 |
| text852 | -0.0456035 | 0.0241563 | -0.0046082 | -0.0075891 |
| text853 | -0.0360190 | 0.0114290 | -0.0095300 | -0.0141206 |
| text854 | -0.0465355 | 0.0347736 | -0.0069596 | -0.0072649 |
| text855 | -0.0152913 | 0.0035681 | -0.0000454 | 0.0017518 |
| text856 | -0.0229131 | 0.0273527 | -0.0067771 | 0.0009098 |
| text857 | -0.0387827 | 0.0676602 | -0.0176577 | 0.0019284 |
| text858 | -0.0348759 | 0.0693582 | -0.0251489 | -0.0046311 |
| text859 | -0.0346929 | 0.0498598 | -0.0126207 | -0.0100523 |
| text860 | -0.0433119 | 0.0879290 | -0.0083658 | 0.0058832 |
| text861 | -0.0242736 | 0.0372177 | -0.0004110 | 0.0375299 |
| text862 | -0.0341259 | 0.0820039 | -0.0051406 | 0.0254618 |
| text863 | -0.0445230 | 0.0753160 | 0.0013732 | 0.0185358 |
| text864 | -0.0025641 | 0.0009739 | -0.0001450 | 0.0006092 |
| text865 | -0.0142168 | 0.0089706 | 0.0076364 | 0.0021647 |
| text866 | -0.0297039 | 0.0299150 | 0.0060302 | 0.0067023 |
| text867 | -0.0349369 | 0.0410755 | 0.0016136 | -0.0011405 |
| text868 | -0.0231776 | 0.0628362 | -0.0055439 | 0.0136257 |
| text869 | -0.0415899 | 0.1032760 | -0.0090069 | 0.0224176 |
| text870 | -0.0339438 | 0.0674425 | -0.0089763 | 0.0076707 |
| text871 | -0.0337707 | 0.0408385 | 0.0054640 | 0.0182210 |
| text872 | -0.0402399 | 0.0897367 | -0.0085321 | 0.0224733 |
| text873 | -0.0320699 | 0.0597322 | -0.0020572 | 0.0112293 |
| text874 | -0.0171929 | 0.0219193 | 0.0007448 | 0.0060973 |
| text875 | -0.0474566 | 0.0781342 | -0.0059912 | 0.0100717 |
| text876 | -0.0331743 | 0.0567434 | -0.0058854 | 0.0031016 |
| text877 | -0.0199596 | 0.0375070 | -0.0036501 | 0.0007932 |
| text878 | -0.0528764 | 0.0888338 | -0.0128526 | -0.0081905 |
| text879 | -0.0063761 | 0.0066353 | 0.0022164 | 0.0005248 |
| text880 | -0.0356680 | 0.0221865 | 0.0162887 | 0.0038154 |
| text881 | -0.0337523 | 0.0040092 | 0.0171010 | 0.0024610 |
| text882 | -0.0317422 | 0.0239174 | 0.0090630 | 0.0029088 |
| text883 | -0.0090004 | 0.0067657 | 0.0011859 | 0.0005803 |
| text884 | -0.0198740 | 0.0294607 | 0.0015478 | 0.0003065 |
| text885 | -0.0235745 | 0.0140956 | 0.0049738 | -0.0030015 |
| text886 | -0.0210413 | 0.0080306 | 0.0057263 | -0.0023939 |
| text887 | -0.0181039 | 0.0187042 | 0.0031963 | -0.0011140 |
| text888 | -0.0097447 | 0.0097773 | 0.0008938 | 0.0013136 |
| text889 | -0.0383505 | 0.0555825 | -0.0061476 | -0.0050629 |
| text890 | -0.0510995 | 0.0372241 | -0.0168786 | -0.0291522 |
| text891 | -0.0531184 | 0.0984009 | -0.0132113 | -0.0055529 |
| text892 | -0.0014824 | 0.0020201 | 0.0001771 | 0.0014040 |
| text893 | -0.0220195 | 0.0329307 | -0.0037088 | -0.0002855 |
| text894 | -0.0543344 | 0.0842363 | -0.0323526 | 0.0032761 |
| text895 | -0.0288419 | 0.0317985 | -0.0159975 | -0.0094430 |
| text896 | -0.0191795 | 0.0272518 | -0.0095959 | -0.0046681 |
| text897 | -0.0282684 | 0.0433301 | -0.0107100 | -0.0100616 |
| text898 | -0.0052711 | 0.0018991 | -0.0020203 | -0.0039901 |
| text899 | -0.0202102 | 0.0176327 | -0.0001023 | 0.0043841 |
| text900 | -0.0198055 | 0.0180507 | -0.0006327 | 0.0063320 |
| text901 | -0.0190382 | -0.0027098 | -0.0008562 | 0.0032137 |
| text902 | -0.0149755 | 0.0088588 | 0.0007665 | 0.0048807 |
| text903 | -0.0257757 | 0.0305526 | 0.0016962 | 0.0053075 |
| text904 | -0.0080021 | 0.0068835 | 0.0010278 | 0.0006646 |
| text905 | -0.0250172 | 0.0246492 | -0.0034599 | -0.0007031 |
| text906 | -0.0315075 | 0.0342969 | -0.0123212 | -0.0015932 |
| text907 | -0.0227468 | 0.0394497 | -0.0064555 | 0.0105014 |
| text908 | -0.0325728 | 0.0616267 | -0.0078691 | 0.0116351 |
| text909 | -0.0243424 | 0.0372922 | -0.0057637 | 0.0146793 |
| text910 | -0.0103048 | 0.0099929 | 0.0003829 | 0.0009863 |
| text911 | -0.0364169 | 0.1095102 | -0.0114897 | 0.0081074 |
| text912 | -0.0451471 | 0.1048550 | -0.0176215 | -0.0071164 |
| text913 | -0.0489680 | 0.1233122 | -0.0259546 | -0.0099902 |
| text914 | -0.0296118 | 0.0652220 | -0.0040458 | 0.0373704 |
| text915 | -0.0345062 | 0.0940712 | -0.0024001 | 0.0577295 |
| text916 | -0.0277423 | 0.0585962 | 0.0066054 | 0.0332074 |
| text917 | -0.0463057 | 0.0757675 | 0.0075516 | 0.0227871 |
| text918 | -0.0270947 | 0.0317818 | 0.0038849 | 0.0078837 |
| text919 | -0.0201617 | 0.0241444 | 0.0010432 | 0.0016828 |
| text920 | -0.0330667 | 0.0579993 | -0.0130458 | -0.0148843 |
| text921 | -0.0259331 | 0.0189877 | 0.0034169 | -0.0070681 |
| text922 | -0.0240337 | 0.0259808 | 0.0143502 | 0.0055267 |
| text923 | -0.0242736 | 0.0372177 | -0.0004110 | 0.0375299 |
| text924 | -0.0341259 | 0.0820039 | -0.0051406 | 0.0254618 |
| text925 | -0.0445230 | 0.0753160 | 0.0013732 | 0.0185358 |
| text926 | -0.0025641 | 0.0009739 | -0.0001450 | 0.0006092 |
| text927 | -0.0463057 | 0.0757675 | 0.0075516 | 0.0227871 |
| text928 | -0.0270947 | 0.0317818 | 0.0038849 | 0.0078837 |
| text929 | -0.0201617 | 0.0241444 | 0.0010432 | 0.0016828 |
| text930 | -0.0330667 | 0.0579993 | -0.0130458 | -0.0148843 |
| text931 | -0.0259331 | 0.0189877 | 0.0034169 | -0.0070681 |
| text932 | -0.0240337 | 0.0259808 | 0.0143502 | 0.0055267 |
| text933 | -0.0271102 | 0.0663286 | -0.0046526 | 0.0116985 |
| text934 | -0.0528747 | 0.1117912 | -0.0113337 | 0.0096725 |
| text935 | -0.0093253 | 0.0181677 | -0.0035374 | 0.0020244 |
| text936 | -0.0452250 | 0.0890385 | -0.0220151 | 0.0045936 |
| text937 | -0.0317083 | 0.0635611 | -0.0139746 | -0.0009301 |
| text938 | -0.0364169 | 0.1095102 | -0.0114897 | 0.0081074 |
| text939 | -0.0451471 | 0.1048550 | -0.0176215 | -0.0071164 |
| text940 | -0.0489680 | 0.1233122 | -0.0259546 | -0.0099902 |
| text941 | -0.0296118 | 0.0652220 | -0.0040458 | 0.0373704 |
| text942 | -0.0345062 | 0.0940712 | -0.0024001 | 0.0577295 |
| text943 | -0.0277423 | 0.0585962 | 0.0066054 | 0.0332074 |
| text944 | -0.0198740 | 0.0294607 | 0.0015478 | 0.0003065 |
| text945 | -0.0235745 | 0.0140956 | 0.0049738 | -0.0030015 |
| text946 | -0.0210413 | 0.0080306 | 0.0057263 | -0.0023939 |
| text947 | -0.0181039 | 0.0187042 | 0.0031963 | -0.0011140 |
| text948 | -0.0097447 | 0.0097773 | 0.0008938 | 0.0013136 |
| text949 | -0.0321612 | 0.0212443 | -0.0028927 | 0.0025856 |
| text950 | -0.0220232 | 0.0019759 | -0.0084276 | -0.0006875 |
| text951 | -0.0329266 | 0.0180397 | 0.0044509 | 0.0162967 |
| text952 | -0.0372411 | 0.0641975 | -0.0164971 | 0.0083752 |
| text953 | -0.0371450 | 0.0806741 | -0.0082700 | 0.0168961 |
| text954 | -0.0302795 | 0.0467986 | 0.0006462 | 0.0103485 |
| text955 | -0.0338505 | 0.0331593 | -0.0014993 | 0.0091597 |
| text956 | -0.0090037 | 0.0045713 | -0.0005206 | 0.0025432 |
| text957 | -0.0425550 | 0.0860514 | 0.0016265 | 0.1127850 |
| text958 | -0.0246766 | 0.0391376 | -0.0027759 | 0.0396524 |
| text959 | -0.0173665 | 0.0058163 | 0.0070624 | 0.0049658 |
| text960 | -0.0248105 | 0.0035831 | 0.0052525 | 0.0018951 |
| text961 | -0.0366646 | 0.0183849 | 0.0022820 | -0.0032275 |
| text962 | -0.0294316 | 0.0329813 | 0.0007769 | 0.0010161 |
| text963 | -0.0138445 | 0.0002658 | -0.0012837 | -0.0072304 |
| text964 | -0.0408020 | 0.0262022 | -0.0270340 | 0.0416703 |
| text965 | -0.0241383 | 0.0233672 | -0.0148032 | 0.0214015 |
| text966 | -0.0341173 | 0.0203013 | 0.0010166 | 0.0288348 |
| text967 | -0.0148961 | 0.0205519 | -0.0007146 | 0.0158165 |
| text968 | -0.0268821 | 0.0270280 | 0.0127560 | 0.0444791 |
| text969 | -0.0252640 | 0.0337439 | 0.0139745 | 0.0510778 |
| text970 | -0.0249958 | 0.0310170 | 0.0110040 | 0.0359319 |
| text971 | -0.0470272 | -0.0326545 | 0.1208566 | 0.0351302 |
| text972 | -0.0125381 | -0.0085591 | 0.0302913 | 0.0038915 |
| text973 | -0.0154731 | -0.0084471 | 0.0286613 | 0.0070375 |
| text974 | -0.0173360 | 0.0020957 | 0.0214444 | 0.0102382 |
| text975 | -0.0056131 | 0.0016355 | 0.0053966 | 0.0020100 |
| text976 | -0.0222003 | -0.0175115 | 0.0060163 | -0.0009680 |
| text977 | -0.0138098 | -0.0122621 | 0.0032728 | -0.0006330 |
| text978 | -0.0178510 | -0.0148211 | 0.0041498 | -0.0030059 |
| text979 | -0.0115464 | -0.0083463 | 0.0004459 | -0.0028709 |
| text980 | -0.0232325 | -0.0168376 | 0.0059982 | -0.0025427 |
| text981 | -0.0210325 | -0.0152036 | 0.0032270 | 0.0012958 |
| text982 | -0.0285603 | -0.0179626 | 0.0046903 | -0.0044626 |
| text983 | -0.0220295 | -0.0154417 | 0.0026191 | -0.0040239 |
| text984 | -0.0256285 | -0.0235578 | -0.0010661 | 0.0026046 |
| text985 | -0.0255021 | -0.0174213 | 0.0197083 | -0.0001143 |
| text986 | -0.0209018 | -0.0130389 | 0.0092151 | -0.0030083 |
| text987 | -0.0147851 | -0.0081150 | 0.0104972 | -0.0003340 |
| text988 | -0.0275899 | -0.0180239 | 0.0399046 | 0.0036791 |
| text989 | -0.0348991 | -0.0182770 | 0.0219337 | 0.0072466 |
| text990 | -0.0372703 | -0.0146489 | 0.0330519 | -0.0004354 |
| text991 | -0.0371351 | -0.0167497 | 0.0665862 | 0.0152198 |
| text992 | -0.0354979 | -0.0191796 | 0.0248302 | 0.0000627 |
| text993 | -0.0243754 | -0.0015298 | 0.0257585 | 0.0043978 |
| text994 | -0.0012240 | -0.0003165 | 0.0010725 | -0.0003083 |
| text995 | -0.0349054 | -0.0130201 | 0.0155796 | -0.0066277 |
| text996 | -0.0221590 | -0.0074224 | 0.0168541 | -0.0023183 |
| text997 | -0.0199178 | -0.0149241 | 0.0510025 | 0.0112930 |
| text998 | -0.0207668 | -0.0188404 | 0.0484058 | 0.0084296 |
| text999 | -0.0250540 | -0.0185392 | 0.0384077 | 0.0092102 |
| text1000 | -0.0155849 | -0.0124551 | 0.0322648 | 0.0077263 |
| text1001 | -0.0091374 | -0.0058421 | 0.0166754 | 0.0026362 |
| text1002 | -0.0241932 | -0.0089632 | 0.0289038 | 0.0055310 |
| text1003 | -0.0280282 | -0.0191892 | 0.0519701 | 0.0120413 |
| text1004 | -0.0204261 | -0.0117114 | 0.0351232 | 0.0071226 |
| text1005 | -0.0169739 | -0.0084673 | 0.0265157 | 0.0052482 |
| text1006 | -0.0084076 | -0.0048086 | 0.0066705 | 0.0001065 |
| text1007 | -0.0144581 | -0.0107126 | 0.0143189 | 0.0070735 |
| text1008 | -0.0284418 | -0.0132638 | 0.0401672 | 0.0112455 |
| text1009 | -0.0297868 | -0.0126024 | 0.0388401 | 0.0072836 |
| text1010 | -0.0254401 | -0.0055932 | 0.0237178 | 0.0064030 |
| text1011 | -0.0284401 | -0.0048377 | 0.0226645 | -0.0022219 |
| text1012 | -0.0246866 | -0.0142298 | 0.0233254 | -0.0028564 |
| text1013 | -0.0208002 | -0.0119017 | 0.0233529 | 0.0030591 |
| text1014 | -0.0131403 | -0.0085198 | 0.0189615 | 0.0039824 |
| text1015 | -0.0210748 | -0.0187154 | 0.0401971 | 0.0141927 |
| text1016 | -0.0239772 | -0.0118048 | 0.0337362 | 0.0024035 |
| text1017 | -0.0226982 | -0.0114860 | 0.0383857 | 0.0049245 |
| text1018 | -0.0071220 | -0.0053205 | 0.0161619 | 0.0027188 |
| text1019 | -0.0145617 | -0.0090597 | 0.0167397 | 0.0061996 |
| text1020 | -0.0114581 | -0.0054636 | 0.0091508 | -0.0015005 |
| text1021 | -0.0130699 | -0.0028457 | 0.0100625 | 0.0002419 |
| text1022 | -0.0162413 | -0.0020402 | 0.0126287 | 0.0040269 |
| text1023 | -0.0220275 | -0.0036692 | 0.0221310 | 0.0038919 |
| text1024 | -0.0293900 | -0.0201911 | 0.0387718 | 0.0049212 |
| text1025 | -0.0140034 | -0.0099257 | 0.0192371 | 0.0039863 |
| text1026 | -0.0178090 | -0.0117884 | 0.0168590 | 0.0016373 |
| text1027 | -0.0284837 | -0.0177239 | 0.0345744 | 0.0012297 |
| text1028 | -0.0290565 | -0.0094444 | 0.0442804 | 0.0079888 |
| text1029 | -0.0198982 | -0.0106255 | 0.0295156 | 0.0028312 |
| text1030 | -0.0410261 | -0.0303065 | 0.0807453 | 0.0145319 |
| text1031 | -0.0265166 | -0.0238923 | 0.0623198 | 0.0085341 |
| text1032 | -0.0128090 | -0.0054520 | 0.0203868 | 0.0033384 |
| text1033 | -0.0160658 | -0.0110982 | 0.0235756 | 0.0055639 |
| text1034 | -0.0246089 | -0.0188022 | 0.0267044 | 0.0055690 |
| text1035 | -0.0172468 | -0.0143210 | 0.0312954 | 0.0057285 |
| text1036 | -0.0176813 | -0.0110164 | 0.0158770 | 0.0028567 |
| text1037 | -0.0194000 | -0.0127450 | 0.0206139 | 0.0077333 |
| text1038 | -0.0061657 | -0.0026137 | 0.0018255 | 0.0040065 |
| text1039 | -0.0092464 | -0.0046801 | 0.0048941 | 0.0029988 |
| text1040 | -0.0179439 | -0.0110591 | 0.0177932 | 0.0038900 |
| text1041 | -0.0139677 | -0.0121509 | 0.0192505 | 0.0035578 |
| text1042 | -0.0192171 | -0.0100091 | 0.0200742 | 0.0050517 |
| text1043 | -0.0171263 | -0.0140020 | 0.0299651 | 0.0065148 |
| text1044 | -0.0010685 | -0.0008475 | 0.0007658 | 0.0010281 |
| text1045 | -0.0151357 | -0.0082455 | 0.0150408 | 0.0014466 |
| text1046 | -0.0200334 | -0.0049669 | 0.0170416 | 0.0019654 |
| text1047 | -0.0253160 | -0.0113686 | 0.0185021 | 0.0002079 |
| text1048 | -0.0251418 | -0.0161469 | 0.0263935 | 0.0027670 |
| text1049 | -0.0179139 | -0.0094574 | 0.0250832 | 0.0091655 |
| text1050 | -0.0099410 | -0.0036277 | 0.0117922 | 0.0030250 |
| text1051 | -0.0419064 | -0.0323624 | 0.0538877 | 0.0078780 |
| text1052 | -0.0203400 | -0.0085750 | 0.0209533 | 0.0041619 |
| text1053 | -0.0209042 | -0.0106663 | 0.0188059 | 0.0041184 |
| text1054 | -0.0199454 | -0.0118466 | 0.0174727 | -0.0017813 |
| text1055 | -0.0276068 | -0.0104652 | 0.0250484 | 0.0029041 |
| text1056 | -0.0170103 | -0.0075140 | 0.0180475 | 0.0030910 |
| text1057 | -0.0010483 | -0.0005873 | 0.0017255 | 0.0002276 |
| text1058 | -0.0185959 | -0.0097532 | 0.0158973 | -0.0001826 |
| text1059 | -0.0133493 | 0.0016092 | 0.0079719 | 0.0005306 |
| text1060 | -0.0234403 | -0.0076870 | 0.0420423 | 0.0069352 |
| text1061 | -0.0171340 | -0.0052601 | 0.0128770 | 0.0005937 |
| text1062 | -0.0139176 | -0.0018282 | 0.0037099 | -0.0024943 |
| text1063 | -0.0117459 | -0.0051565 | 0.0109801 | 0.0032857 |
| text1064 | -0.0142069 | -0.0064186 | 0.0132818 | 0.0021816 |
| text1065 | -0.0135568 | -0.0079545 | 0.0123459 | 0.0015502 |
| text1066 | -0.0272307 | -0.0197886 | 0.0530494 | 0.0075798 |
| text1067 | -0.0377963 | -0.0370024 | 0.1075597 | 0.0183927 |
| text1068 | -0.0291618 | -0.0309225 | 0.0992795 | 0.0193414 |
| text1069 | -0.0150939 | -0.0129664 | 0.0372990 | 0.0088475 |
| text1070 | -0.0179114 | -0.0114680 | 0.0106272 | 0.0025168 |
| text1071 | -0.0136222 | -0.0100518 | 0.0198259 | 0.0035678 |
| text1072 | -0.0216074 | -0.0131234 | 0.0155019 | 0.0065339 |
| text1073 | -0.0212570 | -0.0137254 | 0.0235694 | 0.0043042 |
| text1074 | -0.0040555 | -0.0021911 | 0.0052173 | 0.0012485 |
| text1075 | -0.0214421 | -0.0019089 | 0.0211170 | 0.0125574 |
| text1076 | -0.0197200 | -0.0018771 | 0.0226796 | 0.0107435 |
| text1077 | -0.0151471 | -0.0094570 | 0.0270332 | 0.0066306 |
| text1078 | -0.0207795 | -0.0135142 | 0.0353057 | 0.0062469 |
| text1079 | -0.0113448 | -0.0071943 | 0.0167652 | 0.0021027 |
| text1080 | -0.0253072 | -0.0145377 | 0.0408046 | 0.0098330 |
| text1081 | -0.0236678 | -0.0192688 | 0.0501994 | 0.0116057 |
| text1082 | -0.0199404 | -0.0253949 | 0.0925495 | 0.0207475 |
| text1083 | -0.0151471 | -0.0094570 | 0.0270332 | 0.0066306 |
| text1084 | -0.0207795 | -0.0135142 | 0.0353057 | 0.0062469 |
| text1085 | -0.0113448 | -0.0071943 | 0.0167652 | 0.0021027 |
| text1086 | -0.0253072 | -0.0145377 | 0.0408046 | 0.0098330 |
| text1087 | -0.0236678 | -0.0192688 | 0.0501994 | 0.0116057 |
| text1088 | -0.0199404 | -0.0253949 | 0.0925495 | 0.0207475 |
| text1089 | -0.0094703 | -0.0102072 | 0.0205966 | 0.0054563 |
| text1090 | -0.0285917 | -0.0111176 | 0.0301732 | 0.0010597 |
| text1091 | -0.0201503 | -0.0165776 | 0.0445771 | 0.0083253 |
| text1092 | -0.0229321 | -0.0145344 | 0.0300328 | 0.0078221 |
| text1093 | -0.0202094 | -0.0145203 | 0.0355966 | 0.0065661 |
| text1094 | -0.0242018 | -0.0131876 | 0.0254351 | 0.0035450 |
| text1095 | -0.0067322 | -0.0068213 | 0.0090367 | 0.0015180 |
| text1096 | -0.0173636 | -0.0129301 | 0.0214460 | 0.0042458 |
| text1097 | -0.0320231 | -0.0235083 | 0.0589731 | 0.0121324 |
| text1098 | -0.0237467 | -0.0213508 | 0.0395128 | 0.0036416 |
| text1099 | -0.0169007 | -0.0123195 | 0.0220186 | 0.0059627 |
| text1100 | -0.0126996 | -0.0046020 | 0.0148387 | 0.0031897 |
| text1101 | -0.0122505 | -0.0099574 | 0.0282459 | 0.0077736 |
| text1102 | -0.0162773 | -0.0132139 | 0.0360415 | 0.0075176 |
| text1103 | -0.0214522 | -0.0292719 | 0.0415300 | 0.0063516 |
| text1104 | -0.0319549 | -0.0237327 | 0.0412914 | -0.0012595 |
| text1105 | -0.0245967 | -0.0205498 | 0.0286367 | 0.0061306 |
| text1106 | -0.0304542 | -0.0245625 | 0.0327032 | -0.0067697 |
| text1107 | -0.0247435 | -0.0247196 | 0.0311028 | -0.0032040 |
| text1108 | -0.0257948 | -0.0222642 | 0.0305271 | 0.0013257 |
| text1109 | -0.0262285 | -0.0215731 | 0.0302932 | 0.0009773 |
| text1110 | -0.0252521 | -0.0264942 | 0.0379437 | 0.0049646 |
| text1111 | -0.0192676 | -0.0136515 | 0.0281071 | 0.0009447 |
| text1112 | -0.0347167 | -0.0342337 | 0.0434928 | 0.0004910 |
| text1113 | -0.0244701 | -0.0238497 | 0.0327413 | -0.0063515 |
| text1114 | -0.0269921 | -0.0238286 | 0.0322015 | -0.0073444 |
| text1115 | -0.0377411 | -0.0333112 | 0.0629164 | 0.0001559 |
| text1116 | -0.0368683 | -0.0445966 | 0.0476916 | 0.0210657 |
| text1117 | -0.0259677 | -0.0241687 | 0.0219585 | 0.0026820 |
| text1118 | -0.0301166 | -0.0146272 | 0.0381856 | 0.0052224 |
| text1119 | -0.0162222 | -0.0070305 | 0.0192267 | 0.0030285 |
| text1120 | -0.0145506 | -0.0043810 | 0.0069270 | -0.0007846 |
| text1121 | -0.0174923 | -0.0083459 | 0.0042399 | -0.0041995 |
| text1122 | -0.0184340 | -0.0073860 | 0.0099225 | -0.0045545 |
| text1123 | -0.0296207 | -0.0162231 | 0.0676966 | 0.0100054 |
| text1124 | -0.0305605 | -0.0064434 | 0.0522023 | 0.0066935 |
| text1125 | -0.0274595 | -0.0109308 | 0.0440128 | 0.0060410 |
| text1126 | -0.0296254 | -0.0154951 | 0.0416525 | 0.0058692 |
| text1127 | -0.0287999 | -0.0160668 | 0.0426565 | 0.0016893 |
| text1128 | -0.0324509 | -0.0200455 | 0.0714204 | 0.0054908 |
| text1129 | -0.0158333 | -0.0077993 | 0.0231444 | 0.0040821 |
| text1130 | -0.0166545 | -0.0011147 | 0.0105405 | 0.0087832 |
| text1131 | -0.0270932 | 0.0220911 | 0.0010504 | 0.0151892 |
| text1132 | -0.0161242 | 0.0165804 | -0.0000142 | 0.0060016 |
| text1133 | -0.0225085 | -0.0158844 | 0.0215858 | -0.0027357 |
| text1134 | -0.0236146 | -0.0161793 | 0.0167269 | -0.0038898 |
| text1135 | -0.0227301 | -0.0098292 | 0.0174373 | -0.0051548 |
| text1136 | -0.0227199 | -0.0126223 | 0.0224201 | 0.0015993 |
| text1137 | -0.0192956 | -0.0096321 | 0.0170796 | 0.0006755 |
| text1138 | -0.0085842 | -0.0039003 | 0.0078169 | 0.0004159 |
| text1139 | -0.0274357 | -0.0062567 | 0.0176944 | -0.0012854 |
| text1140 | -0.0243711 | -0.0047832 | 0.0207549 | -0.0017759 |
| text1141 | -0.0238983 | -0.0069998 | 0.0142620 | -0.0031334 |
| text1142 | -0.0174057 | -0.0088408 | 0.0125955 | 0.0023863 |
| text1143 | -0.0242226 | -0.0130990 | 0.0140379 | -0.0032863 |
| text1144 | -0.0285038 | -0.0043321 | 0.0121895 | -0.0028313 |
| text1145 | -0.0064356 | -0.0040815 | 0.0043936 | 0.0025266 |
| text1146 | -0.0262610 | -0.0187197 | 0.0291913 | -0.0009728 |
| text1147 | -0.0167186 | -0.0064467 | 0.0112405 | -0.0007340 |
| text1148 | -0.0207037 | -0.0125794 | 0.0228703 | 0.0029014 |
| text1149 | -0.0129677 | -0.0072709 | 0.0067545 | 0.0023751 |
| text1150 | -0.0187349 | -0.0091253 | 0.0080242 | -0.0066221 |
| text1151 | -0.0124255 | -0.0057169 | 0.0065132 | -0.0007442 |
| text1152 | -0.0201124 | -0.0097322 | 0.0070874 | 0.0008443 |
| text1153 | -0.0196531 | -0.0054270 | 0.0108279 | 0.0007538 |
| text1154 | -0.0179157 | -0.0102538 | 0.0067125 | -0.0059407 |
| text1155 | -0.0113366 | -0.0059591 | 0.0069865 | -0.0042080 |
| text1156 | -0.0062321 | -0.0020855 | 0.0018303 | -0.0020181 |
| text1157 | -0.0235368 | -0.0241941 | 0.0584593 | 0.0124386 |
| text1158 | -0.0244855 | -0.0106466 | 0.0221217 | 0.0053385 |
| text1159 | -0.0131993 | -0.0064533 | 0.0062377 | 0.0015792 |
| text1160 | -0.0122562 | -0.0080524 | 0.0225224 | 0.0060705 |
| text1161 | -0.0049293 | -0.0027934 | 0.0094813 | 0.0021753 |
| text1162 | -0.0211644 | -0.0125356 | 0.0275538 | 0.0085401 |
| text1163 | -0.0190492 | -0.0107053 | 0.0194465 | 0.0058457 |
| text1164 | -0.0225154 | -0.0145326 | 0.0288332 | 0.0076590 |
| text1165 | -0.0171225 | -0.0116729 | 0.0278252 | 0.0040452 |
| text1166 | -0.0244733 | -0.0117538 | 0.0265764 | 0.0098712 |
| text1167 | -0.0117007 | -0.0031526 | 0.0143305 | 0.0045570 |
| text1168 | -0.0186590 | -0.0083812 | 0.0116189 | -0.0017007 |
| text1169 | -0.0238350 | -0.0159615 | 0.0212070 | 0.0031323 |
| text1170 | -0.0125544 | -0.0030595 | 0.0098875 | 0.0021161 |
| text1171 | -0.0252072 | -0.0121408 | 0.0222294 | -0.0036411 |
| text1172 | -0.0189061 | -0.0102335 | 0.0133336 | -0.0010737 |
| text1173 | -0.0079982 | -0.0044934 | 0.0070771 | 0.0009794 |
| text1174 | -0.0186590 | -0.0083812 | 0.0116189 | -0.0017007 |
| text1175 | -0.0238350 | -0.0159615 | 0.0212070 | 0.0031323 |
| text1176 | -0.0125544 | -0.0030595 | 0.0098875 | 0.0021161 |
| text1177 | -0.0252072 | -0.0121408 | 0.0222294 | -0.0036411 |
| text1178 | -0.0189061 | -0.0102335 | 0.0133336 | -0.0010737 |
| text1179 | -0.0079982 | -0.0044934 | 0.0070771 | 0.0009794 |
| text1180 | -0.0238085 | -0.0121498 | 0.0100108 | 0.0051301 |
| text1181 | -0.0191390 | -0.0114405 | 0.0192273 | 0.0013722 |
| text1182 | -0.0191605 | -0.0130434 | 0.0106901 | 0.0023199 |
| text1183 | -0.0120912 | -0.0079547 | 0.0070569 | 0.0010177 |
| text1184 | -0.0215433 | -0.0129933 | 0.0162611 | 0.0040735 |
| text1185 | -0.0190242 | -0.0081960 | 0.0186388 | 0.0028375 |
| text1186 | -0.0099459 | -0.0047660 | 0.0080351 | 0.0015743 |
| text1187 | -0.0281091 | -0.0187626 | 0.0282544 | 0.0051531 |
| text1188 | -0.0149369 | -0.0079684 | 0.0112335 | 0.0032388 |
| text1189 | -0.0281091 | -0.0187626 | 0.0282544 | 0.0051531 |
| text1190 | -0.0149369 | -0.0079684 | 0.0112335 | 0.0032388 |
| text1191 | -0.0231752 | -0.0194658 | 0.0656339 | 0.0108038 |
| text1192 | -0.0227851 | -0.0136126 | 0.0451410 | 0.0034113 |
| text1193 | -0.0198902 | -0.0151360 | 0.0360868 | 0.0067546 |
| text1194 | -0.0228518 | -0.0143274 | 0.0424977 | 0.0054094 |
| text1195 | -0.0195070 | -0.0174073 | 0.0392162 | 0.0043063 |
| text1196 | -0.0218024 | -0.0146727 | 0.0400129 | 0.0066365 |
| text1197 | -0.0158582 | -0.0147543 | 0.0377303 | 0.0030246 |
| text1198 | -0.0232767 | -0.0152794 | 0.0389025 | 0.0046568 |
| text1199 | -0.0535793 | -0.0365869 | 0.0983513 | 0.0174463 |
| text1200 | -0.0253214 | -0.0127435 | 0.0333462 | 0.0033570 |
| text1201 | -0.0396566 | -0.0300656 | 0.0737924 | 0.0151038 |
| text1202 | -0.0098603 | -0.0060615 | 0.0162700 | 0.0029353 |
| text1203 | -0.0242161 | -0.0247548 | 0.0577583 | 0.0068936 |
| text1204 | -0.0144860 | -0.0100654 | 0.0240335 | 0.0032534 |
| text1205 | -0.0129762 | -0.0059343 | 0.0117907 | -0.0007517 |
| text1206 | -0.0113764 | -0.0063911 | 0.0146978 | 0.0040480 |
| text1207 | -0.0102598 | -0.0059454 | 0.0132757 | 0.0054319 |
| text1208 | -0.0173313 | -0.0127241 | 0.0273363 | 0.0052954 |
| text1209 | -0.0133546 | -0.0115297 | 0.0241664 | 0.0050609 |
| text1210 | -0.0124874 | -0.0099186 | 0.0171910 | 0.0012146 |
| text1211 | -0.0110944 | -0.0064187 | 0.0090577 | 0.0018846 |
| text1212 | -0.0040175 | -0.0018344 | 0.0023126 | -0.0002280 |
| text1213 | -0.0185136 | -0.0030719 | 0.0264387 | 0.0148726 |
| text1214 | -0.0159681 | -0.0057147 | 0.0195049 | 0.0110946 |
| text1215 | -0.0132846 | -0.0078471 | 0.0112509 | 0.0070811 |
| text1216 | -0.0259760 | -0.0141900 | 0.0243252 | 0.0053645 |
| text1217 | -0.0223435 | -0.0125704 | 0.0208375 | 0.0016063 |
| text1218 | -0.0235111 | -0.0133797 | 0.0268141 | 0.0079958 |
| text1219 | -0.0256192 | -0.0080074 | 0.0307764 | 0.0099518 |
| text1220 | -0.0354537 | -0.0196347 | 0.0526016 | 0.0044227 |
| text1221 | -0.0076849 | -0.0044281 | 0.0065313 | 0.0026662 |
| text1222 | -0.0331396 | -0.0191415 | 0.0480987 | -0.0022103 |
| text1223 | -0.0151661 | -0.0119982 | 0.0226967 | -0.0031261 |
| text1224 | -0.0267368 | -0.0213433 | 0.0396613 | 0.0074942 |
| text1225 | -0.0211184 | -0.0108045 | 0.0263006 | 0.0066452 |
| text1226 | -0.0126821 | -0.0097052 | 0.0191086 | 0.0068496 |
| text1227 | -0.0185914 | -0.0167589 | 0.0322808 | 0.0072619 |
| text1228 | -0.0316281 | -0.0239423 | 0.0446281 | 0.0112825 |
| text1229 | -0.0201399 | -0.0165777 | 0.0390141 | 0.0103380 |
| text1230 | -0.0184581 | -0.0108156 | 0.0234617 | 0.0076415 |
| text1231 | -0.0237889 | -0.0157002 | 0.0282077 | 0.0117356 |
| text1232 | -0.0097602 | -0.0092773 | 0.0174648 | 0.0051544 |
| text1233 | -0.0386951 | -0.0187885 | 0.0905965 | 0.0210118 |
| text1234 | -0.0445560 | -0.0346257 | 0.1287691 | 0.0256585 |
| text1235 | -0.0450785 | -0.0316707 | 0.1044005 | 0.0273610 |
| text1236 | -0.0248358 | -0.0102612 | 0.0404128 | 0.0062831 |
| text1237 | -0.0239457 | -0.0109814 | 0.0187056 | -0.0008420 |
| text1238 | -0.0241894 | -0.0135012 | 0.0230123 | 0.0038015 |
| text1239 | -0.0056791 | 0.0004618 | 0.0034132 | 0.0012086 |
| text1240 | -0.0203604 | -0.0127864 | 0.0261125 | 0.0077190 |
| text1241 | -0.0172091 | -0.0058820 | 0.0180481 | 0.0003501 |
| text1242 | -0.0221294 | -0.0080191 | 0.0197600 | -0.0018498 |
| text1243 | -0.0148584 | -0.0117424 | 0.0197293 | 0.0010147 |
| text1244 | -0.0155019 | -0.0087822 | 0.0227989 | 0.0025723 |
| text1245 | -0.0031622 | -0.0023090 | 0.0035140 | 0.0010601 |
| text1246 | -0.0169609 | -0.0072296 | 0.0121245 | 0.0010414 |
| text1247 | -0.0152729 | -0.0121264 | 0.0113229 | 0.0046816 |
| text1248 | -0.0235211 | -0.0119081 | 0.0203465 | -0.0015821 |
| text1249 | -0.0107376 | -0.0043747 | 0.0084898 | -0.0001421 |
| text1250 | -0.0202073 | -0.0117020 | 0.0171877 | 0.0000047 |
| text1251 | -0.0161262 | -0.0066680 | 0.0133748 | -0.0015715 |
| text1252 | -0.0126374 | -0.0109898 | 0.0169301 | 0.0022939 |
| text1253 | -0.0094042 | -0.0062960 | 0.0063829 | 0.0012226 |
| text1254 | -0.0097084 | -0.0055958 | 0.0051556 | -0.0019282 |
| text1255 | -0.0121109 | -0.0025475 | 0.0089829 | -0.0010848 |
| text1256 | -0.0168702 | -0.0075219 | 0.0238480 | 0.0064874 |
| text1257 | -0.0238096 | -0.0161113 | 0.0331021 | 0.0016966 |
| text1258 | -0.0238813 | -0.0115240 | 0.0260739 | 0.0015260 |
| text1259 | -0.0180455 | -0.0110059 | 0.0193329 | 0.0028770 |
| text1260 | -0.0137165 | -0.0092296 | 0.0142035 | 0.0010840 |
| text1261 | -0.0162348 | -0.0105286 | 0.0211166 | 0.0001892 |
| text1262 | -0.0001354 | -0.0002113 | 0.0005279 | -0.0000709 |
| text1263 | -0.0213465 | -0.0051867 | 0.0197088 | 0.0001245 |
| text1264 | -0.0184797 | 0.0006751 | 0.0139339 | 0.0029983 |
| text1265 | -0.0139337 | -0.0078511 | 0.0118389 | 0.0029084 |
| text1266 | -0.0145363 | 0.0023955 | 0.0126649 | 0.0067576 |
| text1267 | -0.0231630 | -0.0084126 | 0.0196143 | 0.0042069 |
| text1268 | -0.0186263 | -0.0033988 | 0.0079080 | -0.0013910 |
| text1269 | -0.0196692 | -0.0003400 | 0.0089264 | 0.0017387 |
| text1270 | -0.0298738 | -0.0060255 | 0.0145928 | -0.0028674 |
| text1271 | -0.0255825 | -0.0069126 | 0.0162168 | 0.0056273 |
| text1272 | -0.0183746 | -0.0072260 | 0.0118741 | 0.0022930 |
| text1273 | -0.0224918 | -0.0058505 | 0.0161267 | 0.0084872 |
| text1274 | -0.0187939 | -0.0123916 | 0.0119268 | -0.0000274 |
| text1275 | -0.0250872 | -0.0154157 | 0.0147803 | -0.0003747 |
| text1276 | -0.0084241 | -0.0042067 | 0.0048075 | 0.0006028 |
| text1277 | -0.0266410 | -0.0263991 | 0.0700499 | 0.0159152 |
| text1278 | -0.0235953 | -0.0185272 | 0.0405770 | 0.0094306 |
| text1279 | -0.0353413 | -0.0343207 | 0.0782522 | 0.0191723 |
| text1280 | -0.0243772 | -0.0216084 | 0.0527765 | 0.0088372 |
| text1281 | -0.0218127 | -0.0106865 | 0.0212158 | -0.0005694 |
| text1282 | -0.0178868 | -0.0063653 | 0.0170606 | 0.0000369 |
| text1283 | -0.0181415 | -0.0060938 | 0.0161015 | -0.0000281 |
| text1284 | -0.0196875 | -0.0113688 | 0.0269542 | 0.0042661 |
| text1285 | -0.0224882 | -0.0093405 | 0.0223077 | 0.0025204 |
| text1286 | -0.0123197 | -0.0053514 | 0.0147080 | 0.0031464 |
| text1287 | -0.0295005 | -0.0029757 | 0.0453603 | 0.0131348 |
| text1288 | -0.0228497 | -0.0118634 | 0.0299708 | 0.0060066 |
| text1289 | -0.0239658 | -0.0097514 | 0.0337060 | 0.0116057 |
| text1290 | -0.0167061 | 0.0004961 | 0.0088835 | 0.0050500 |
| text1291 | -0.0194456 | -0.0114314 | 0.0230972 | 0.0059780 |
| text1292 | -0.0199051 | -0.0065633 | 0.0276749 | 0.0065087 |
| text1293 | -0.0164042 | -0.0089796 | 0.0250220 | 0.0043664 |
| text1294 | -0.0292774 | -0.0259460 | 0.0636581 | 0.0151690 |
| text1295 | -0.0214477 | -0.0112708 | 0.0429149 | 0.0117623 |
| text1296 | -0.0310994 | -0.0175278 | 0.0720504 | 0.0157519 |
| text1297 | -0.0328056 | -0.0313461 | 0.0822017 | 0.0179847 |
| text1298 | -0.0345539 | -0.0252388 | 0.0733985 | 0.0120810 |
| text1299 | -0.0257661 | -0.0198282 | 0.0459608 | 0.0083313 |
| text1300 | -0.0147166 | -0.0090563 | 0.0233870 | 0.0062795 |
| text1301 | -0.0263456 | -0.0156293 | 0.0255102 | 0.0005722 |
| text1302 | -0.0331501 | -0.0167364 | 0.0434939 | 0.0041830 |
| text1303 | -0.0276590 | -0.0128778 | 0.0334430 | 0.0068494 |
| text1304 | -0.0249355 | -0.0151158 | 0.0301837 | 0.0039432 |
| text1305 | -0.0203486 | -0.0148969 | 0.0205627 | -0.0003576 |
| text1306 | -0.0199036 | -0.0074581 | 0.0176603 | 0.0003516 |
| text1307 | -0.0141087 | -0.0034968 | 0.0107500 | 0.0018270 |
| text1308 | -0.0110963 | -0.0047028 | 0.0044790 | 0.0017730 |
| text1309 | -0.0148529 | -0.0022568 | 0.0078490 | 0.0036055 |
| text1310 | -0.0163102 | -0.0072631 | 0.0080936 | 0.0012271 |
| text1311 | -0.0190380 | -0.0121390 | 0.0124352 | -0.0004337 |
| text1312 | -0.0148717 | -0.0060666 | 0.0131142 | 0.0016983 |
| text1313 | -0.0146381 | -0.0066988 | 0.0142692 | 0.0050205 |
| text1314 | -0.0164621 | -0.0049447 | 0.0100225 | 0.0025445 |
| text1315 | -0.0027845 | -0.0011945 | 0.0010697 | -0.0001716 |
| text1316 | -0.0220135 | -0.0101771 | 0.0200871 | 0.0054067 |
| text1317 | -0.0191518 | -0.0149883 | 0.0391383 | 0.0081219 |
| text1318 | -0.0125505 | -0.0086241 | 0.0191920 | 0.0031596 |
| text1319 | -0.0158330 | -0.0147393 | 0.0266364 | 0.0046906 |
| text1320 | -0.0162192 | -0.0103692 | 0.0177857 | 0.0029235 |
| text1321 | -0.0023085 | -0.0005858 | 0.0022416 | 0.0010583 |
| text1322 | -0.0285936 | -0.0179461 | 0.0333453 | 0.0092403 |
| text1323 | -0.0334368 | -0.0152622 | 0.0343416 | 0.0107335 |
| text1324 | -0.0325993 | -0.0128034 | 0.0211156 | -0.0007863 |
| text1325 | -0.0150414 | -0.0048147 | 0.0114505 | 0.0037322 |
| text1326 | -0.0300606 | -0.0097281 | 0.0596521 | 0.0098124 |
| text1327 | -0.0094376 | -0.0008894 | 0.0117267 | 0.0007032 |
| text1328 | -0.0154628 | -0.0104481 | 0.0215812 | 0.0058727 |
| text1329 | -0.0171304 | -0.0075659 | 0.0115213 | 0.0060447 |
| text1330 | -0.0222916 | -0.0071053 | 0.0118147 | 0.0058899 |
| text1331 | -0.0236364 | -0.0145963 | 0.0350382 | 0.0061551 |
| text1332 | -0.0288623 | -0.0229157 | 0.0721740 | 0.0151759 |
| text1333 | -0.0258637 | -0.0138873 | 0.0433895 | 0.0110590 |
| text1334 | -0.0023720 | -0.0012391 | 0.0008529 | -0.0005517 |
| text1335 | -0.0156818 | -0.0023940 | 0.0061661 | 0.0004349 |
| text1336 | -0.0134877 | 0.0049014 | 0.0054709 | 0.0066651 |
| text1337 | -0.0214546 | -0.0074741 | 0.0090005 | 0.0100370 |
| text1338 | -0.0203300 | -0.0108516 | 0.0054916 | 0.0151831 |
| text1339 | -0.0046561 | 0.0002105 | 0.0034609 | 0.0007325 |
| text1340 | -0.0208681 | -0.0131218 | 0.0194035 | 0.0046350 |
| text1341 | -0.0099517 | -0.0011642 | 0.0078330 | 0.0030831 |
| text1342 | -0.0274237 | -0.0111208 | 0.0221667 | 0.0096249 |
| text1343 | -0.0104811 | 0.0034198 | 0.0047793 | -0.0002065 |
| text1344 | -0.0183651 | -0.0108382 | 0.0050208 | 0.0001450 |
| text1345 | -0.0168909 | -0.0094141 | 0.0091423 | 0.0047790 |
| text1346 | -0.0101880 | -0.0023519 | 0.0062977 | -0.0001711 |
| text1347 | -0.0060476 | -0.0025775 | 0.0003926 | -0.0025569 |
| text1348 | -0.0199178 | -0.0149241 | 0.0510025 | 0.0112930 |
| text1349 | -0.0207668 | -0.0188404 | 0.0484058 | 0.0084296 |
| text1350 | -0.0250540 | -0.0185392 | 0.0384077 | 0.0092102 |
| text1351 | -0.0155849 | -0.0124551 | 0.0322648 | 0.0077263 |
| text1352 | -0.0091374 | -0.0058421 | 0.0166754 | 0.0026362 |
| text1353 | -0.0262845 | -0.0202025 | 0.0626862 | 0.0147606 |
| text1354 | -0.0121905 | -0.0077810 | 0.0167179 | 0.0065283 |
| text1355 | -0.0175985 | -0.0129004 | 0.0263908 | 0.0048875 |
| text1356 | -0.0181389 | -0.0165556 | 0.0477068 | 0.0085469 |
| text1357 | -0.0180117 | -0.0201083 | 0.0517978 | 0.0115285 |
| text1358 | -0.0314476 | -0.0075955 | 0.0414872 | 0.0076994 |
| text1359 | -0.0190992 | -0.0092232 | 0.0322768 | 0.0035414 |
| text1360 | -0.0255813 | -0.0054276 | 0.0355909 | 0.0059357 |
| text1361 | -0.0274213 | -0.0008209 | 0.0365301 | 0.0055831 |
| text1362 | -0.0086179 | -0.0044916 | 0.0140517 | 0.0032662 |
| text1363 | -0.0176532 | -0.0071213 | 0.0115839 | 0.0030242 |
| text1364 | -0.0160607 | -0.0076934 | 0.0121910 | 0.0022804 |
| text1365 | -0.0196449 | -0.0077007 | 0.0148938 | 0.0033588 |
| text1366 | -0.0143716 | -0.0087438 | 0.0191285 | 0.0056031 |
| text1367 | -0.0196434 | -0.0094279 | 0.0222537 | 0.0034856 |
| text1368 | -0.0086982 | -0.0048728 | 0.0151825 | 0.0027900 |
| text1369 | -0.0045530 | -0.0027982 | 0.0062111 | 0.0006941 |
| text1370 | -0.0219227 | -0.0196646 | 0.0472492 | 0.0074813 |
| text1371 | -0.0208764 | -0.0124188 | 0.0201754 | 0.0009754 |
| text1372 | -0.0045387 | -0.0036443 | 0.0089064 | 0.0005189 |
| text1373 | -0.0279274 | -0.0133612 | 0.0196186 | 0.0026106 |
| text1374 | -0.0185426 | -0.0114527 | 0.0168776 | 0.0020348 |
| text1375 | -0.0186296 | -0.0098700 | 0.0178121 | 0.0036093 |
| text1376 | -0.0256559 | -0.0123573 | 0.0250706 | 0.0053335 |
| text1377 | -0.0341733 | -0.0145620 | 0.0323973 | 0.0095320 |
| text1378 | -0.0013368 | -0.0000752 | 0.0007296 | -0.0003470 |
| text1379 | -0.0275008 | -0.0186571 | 0.0350006 | 0.0157030 |
| text1380 | -0.0227336 | -0.0122630 | 0.0312561 | 0.0074033 |
| text1381 | -0.0271434 | -0.0119445 | 0.0270816 | 0.0070745 |
| text1382 | -0.0289924 | -0.0193119 | 0.0501115 | 0.0134813 |
| text1383 | -0.0147569 | -0.0085744 | 0.0235488 | 0.0052632 |
| text1384 | -0.0174883 | -0.0082715 | 0.0119206 | 0.0071318 |
| text1385 | -0.0172456 | -0.0076378 | 0.0140487 | 0.0036910 |
| text1386 | -0.0186192 | -0.0100199 | 0.0154112 | 0.0053159 |
| text1387 | -0.0207576 | -0.0109796 | 0.0239896 | 0.0121479 |
| text1388 | -0.0265772 | -0.0073118 | 0.0224329 | 0.0154615 |
| text1389 | -0.0137747 | -0.0027725 | 0.0091263 | 0.0036001 |
| text1390 | -0.0092731 | -0.0030433 | 0.0046728 | 0.0026771 |
| text1391 | -0.0320523 | -0.0344569 | 0.1407007 | 0.0186013 |
| text1392 | -0.0185421 | -0.0243537 | 0.0970342 | 0.0165547 |
| text1393 | -0.0193515 | -0.0220953 | 0.0762141 | 0.0093244 |
| text1394 | -0.0242607 | -0.0313627 | 0.1174021 | 0.0181020 |
| text1395 | -0.0199014 | -0.0195033 | 0.0611362 | 0.0101504 |
| text1396 | -0.0263855 | -0.0275125 | 0.0847913 | 0.0150223 |
| text1397 | -0.0152298 | -0.0237583 | 0.0745102 | 0.0106841 |
| text1398 | -0.0146808 | -0.0170921 | 0.0511883 | 0.0065677 |
| text1399 | -0.0107480 | -0.0099173 | 0.0198269 | 0.0067846 |
| text1400 | -0.0168060 | -0.0170032 | 0.0804844 | 0.0108055 |
| text1401 | -0.0080115 | -0.0059210 | 0.0181123 | 0.0012585 |
| text1402 | -0.0172274 | -0.0193586 | 0.0644531 | 0.0112668 |
| text1403 | -0.0075528 | -0.0063734 | 0.0195733 | 0.0043290 |
| text1404 | -0.0241621 | -0.0196769 | 0.0095680 | 0.0134103 |
| text1405 | -0.0192720 | -0.0172307 | 0.0120371 | 0.0077019 |
| text1406 | -0.0201197 | -0.0140707 | 0.0166962 | 0.0091761 |
| text1407 | -0.0256287 | -0.0199207 | 0.0216180 | 0.0112929 |
| text1408 | -0.0198045 | -0.0180369 | 0.0357508 | 0.0090264 |
| text1409 | -0.0214027 | -0.0133113 | 0.0381165 | 0.0080108 |
| text1410 | -0.0099786 | -0.0057152 | 0.0147671 | 0.0014337 |
| text1411 | -0.0388965 | -0.0293422 | 0.0615207 | 0.0078692 |
| text1412 | -0.0165393 | -0.0122713 | 0.0248021 | 0.0062065 |
| text1413 | -0.0253738 | -0.0132514 | 0.0361920 | 0.0095223 |
| text1414 | -0.0209431 | -0.0159577 | 0.0362313 | 0.0042816 |
| text1415 | -0.0236067 | -0.0173798 | 0.0272316 | 0.0035329 |
| text1416 | -0.0368629 | -0.0263240 | -0.0513811 | -0.0950757 |
| text1417 | -0.0259037 | -0.0219388 | -0.0167730 | -0.0547199 |
| text1418 | -0.0368457 | -0.0280660 | -0.0328888 | -0.0797909 |
| text1419 | -0.0361311 | -0.0085427 | -0.0430633 | -0.0817451 |
| text1420 | -0.0071000 | -0.0051373 | -0.0085707 | -0.0121793 |
| text1421 | -0.0368629 | -0.0263240 | -0.0513811 | -0.0950757 |
| text1422 | -0.0259037 | -0.0219388 | -0.0167730 | -0.0547199 |
| text1423 | -0.0368457 | -0.0280660 | -0.0328888 | -0.0797909 |
| text1424 | -0.0361311 | -0.0085427 | -0.0430633 | -0.0817451 |
| text1425 | -0.0071000 | -0.0051373 | -0.0085707 | -0.0121793 |
| text1426 | -0.0136770 | -0.0056069 | 0.0091418 | 0.0027283 |
| text1427 | -0.0234082 | -0.0096756 | 0.0297016 | 0.0122550 |
| text1428 | -0.0201513 | -0.0022685 | 0.0193452 | 0.0033832 |
| text1429 | -0.0109535 | -0.0037946 | 0.0085212 | -0.0008822 |
| text1430 | -0.0246360 | -0.0213953 | 0.0665742 | 0.0141814 |
| text1431 | -0.0216925 | -0.0162035 | 0.0521358 | 0.0117723 |
| text1432 | -0.0221570 | -0.0197074 | 0.0270826 | 0.0183850 |
| text1433 | -0.0085360 | -0.0061643 | 0.0138918 | 0.0040306 |
| text1434 | -0.0290395 | -0.0183560 | 0.0358862 | 0.0094803 |
| text1435 | -0.0128230 | -0.0091225 | 0.0188358 | 0.0049251 |
| text1436 | -0.0226302 | -0.0146832 | 0.0381589 | 0.0072173 |
| text1437 | -0.0226699 | -0.0110577 | 0.0294877 | 0.0063232 |
| text1438 | -0.0343363 | -0.0083079 | 0.0286105 | 0.0070472 |
| text1439 | -0.0257373 | -0.0141723 | 0.0234504 | 0.0007082 |
| text1440 | -0.0392825 | -0.0226554 | 0.0629511 | 0.0055264 |
| text1441 | -0.0172997 | -0.0113387 | 0.0245082 | 0.0021510 |
| text1442 | -0.0250266 | -0.0163937 | 0.0349318 | 0.0082551 |
| text1443 | -0.0233034 | -0.0117810 | 0.0307270 | 0.0028201 |
| text1444 | -0.0262489 | -0.0152228 | 0.0354920 | 0.0063862 |
| text1445 | -0.0324450 | -0.0175934 | 0.0350503 | 0.0038955 |
| text1446 | -0.0303233 | -0.0159270 | 0.0375620 | 0.0043827 |
| text1447 | -0.0141617 | -0.0080686 | 0.0181165 | 0.0013674 |
| text1448 | -0.0155422 | -0.0057841 | 0.0114764 | 0.0041780 |
| text1449 | -0.0252204 | -0.0154459 | 0.0254361 | 0.0061764 |
| text1450 | -0.0177621 | -0.0080365 | 0.0234353 | 0.0076322 |
| text1451 | -0.0179631 | -0.0098150 | 0.0208695 | 0.0018190 |
| text1452 | -0.0081377 | -0.0039232 | 0.0077366 | 0.0017474 |
| text1453 | -0.0127695 | -0.0093685 | 0.0206262 | 0.0038767 |
| text1454 | -0.0210029 | -0.0148554 | 0.0205513 | 0.0027548 |
| text1455 | -0.0111173 | -0.0100176 | 0.0172857 | 0.0038602 |
| text1456 | -0.0176743 | -0.0147119 | 0.0336330 | 0.0073582 |
| text1457 | -0.0038202 | -0.0033638 | 0.0078626 | 0.0019853 |
| text1458 | -0.0214738 | -0.0107076 | 0.0219580 | 0.0078269 |
| text1459 | -0.0238631 | -0.0169482 | 0.0258979 | 0.0073586 |
| text1460 | -0.0157018 | -0.0087490 | 0.0120742 | -0.0013495 |
| text1461 | -0.0316158 | -0.0141059 | 0.0248640 | 0.0030479 |
| text1462 | -0.0225950 | -0.0069671 | 0.0189227 | -0.0013091 |
| text1463 | -0.0067600 | -0.0036397 | 0.0049023 | 0.0002011 |
| text1464 | -0.0127280 | -0.0025830 | 0.0106006 | 0.0011827 |
| text1465 | -0.0290737 | -0.0100324 | 0.0204337 | 0.0101260 |
| text1466 | -0.0162131 | -0.0089068 | 0.0189332 | 0.0077427 |
| text1467 | -0.0222891 | -0.0097112 | 0.0177865 | 0.0055085 |
| text1468 | -0.0480732 | -0.0210247 | 0.0152098 | -0.0214661 |
| text1469 | -0.0029752 | -0.0017168 | 0.0042393 | 0.0007534 |
| text1470 | -0.0296134 | -0.0113073 | 0.0301954 | -0.0039040 |
| text1471 | -0.0101514 | -0.0033470 | 0.0048832 | -0.0014561 |
This Doc-topic sim. table shows the link between each text and each topic. For example, text3 is most relevant to dimension 3 (topic 3).
Topic_Strength2 <- data.frame(CN,TED.lsa2$sk)
kable(Topic_Strength2,
col.names= c("Dimension","Topic strength"),
caption = "Topic strength(LSA on TF-IDF)")%>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "200px")
| Dimension | Topic strength |
|---|---|
| dimension1 | 148.71774 |
| dimension2 | 92.26676 |
| dimension3 | 83.34253 |
| dimension4 | 79.22164 |
This Topic strength table represents the strength of each dimension (topic). For example, dimension 4 has the smallest strength.
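These strengths are the singular values of the decomposition, so each dimension's relative weight can be read off by normalising them. A minimal Python sketch of that normalisation, reusing the values from the table above (the sketch is illustrative, not part of the R pipeline):

```python
# Singular values ("topic strengths") from the LSA on TF-IDF above.
strengths = {"dimension1": 148.71774, "dimension2": 92.26676,
             "dimension3": 83.34253, "dimension4": 79.22164}

# Normalise each strength by the total to get a relative weight per dimension.
total = sum(strengths.values())
shares = {dim: s / total for dim, s in strengths.items()}
# dimension1 carries roughly 37% of the total strength.
```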
kable(head(TED.lsa2$features,10),
col.names = c("dimension1","dimension2","dimension3","dimension4"),
caption = "Terms-topic sim.(LSA on TF-IDF)") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| dimension1 | dimension2 | dimension3 | dimension4 | |
|---|---|---|---|---|
| today | -0.0536687 | 0.0095691 | -0.0068271 | -0.0241286 |
| artificial | -0.0415968 | -0.0378650 | -0.0482462 | -0.0482060 |
| intelligence | -0.0645813 | -0.0679693 | -0.0779669 | -0.0735150 |
| help | -0.0327216 | -0.0080277 | 0.0110845 | -0.0074941 |
| doctor | -0.0239033 | -0.0187716 | 0.0082665 | -0.0113366 |
| diagnose | -0.0133458 | -0.0141806 | -0.0075820 | -0.0165815 |
| patient | -0.0277309 | -0.0236963 | -0.0039323 | -0.0299937 |
| pilot | -0.0067898 | 0.0018689 | -0.0036318 | 0.0001526 |
| fly | -0.0204832 | -0.0025275 | -0.0187381 | 0.0240834 |
| commercial | -0.0072157 | 0.0025301 | -0.0057654 | -0.0052020 |
This Terms-topic sim. table shows the link between each term and each topic. For example, the terms “artificial” and “intelligence” are both most relevant to dimension 3 (topic 3).
We also check the top words for dimensions 2, 3, and 4 of LSA on TF-IDF.
## For Dimension 2
w2.order <- sort(TED.lsa2$features[, 2],decreasing = TRUE)
w2.top.d2 <- c(w2.order[1:n.terms],rev(rev(w2.order)[1:n.terms]))
## For Dimension 3
w2.order <- sort(TED.lsa2$features[, 3], decreasing = TRUE)
w2.top.d3 <- c(w2.order[1:n.terms], rev(rev(w2.order)[1:n.terms]))
## For Dimension 4
w2.order <- sort(TED.lsa2$features[, 4], decreasing = TRUE)
w2.top.d4 <- c(w2.order[1:n.terms], rev(rev(w2.order)[1:n.terms]))
kable(w2.top.d2,
col.names = "value",
caption = "Dimension 2(LSA on TF-IDF)")%>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "200px")
| value | |
|---|---|
| forest | 0.2176291 |
| carbon | 0.2084673 |
| climate | 0.1770268 |
| emission | 0.1768026 |
| energy | 0.1434831 |
| human | -0.0782871 |
| computer | -0.0820833 |
| machine | -0.0828424 |
| ai | -0.1643167 |
| robot | -0.2829436 |
For this LSA, dimension 2 is positively associated with words like “forest”, “carbon”, “climate”, “emission”, and “energy”, and negatively associated with “human”, “computer”, “machine”, “ai”, and “robot”.
kable(w2.top.d3,
col.names = "value",
caption = "Dimension 3(LSA on TF-IDF)")%>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "200px")
| value | |
|---|---|
| regret | 0.2568698 |
| sex | 0.2312091 |
| woman | 0.1600525 |
| love | 0.1539985 |
| man | 0.1155972 |
| datum | -0.0791178 |
| machine | -0.0826875 |
| rule | -0.0872609 |
| ai | -0.2548430 |
| robot | -0.4413087 |
Dimension 3 is positively associated with words like “regret”, “sex”, “woman”, “love”, and “man”, and negatively associated with “datum”, “machine”, “rule”, “ai”, and “robot”.
kable(w2.top.d4,
col.names = "value",
caption = "Dimension 4(LSA on TF-IDF)")%>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "200px")
| value | |
|---|---|
| robot | 0.6214054 |
| rule | 0.1322063 |
| bee | 0.1247338 |
| seaweed | 0.1094784 |
| coral | 0.1068444 |
| machine | -0.0782485 |
| human | -0.0878439 |
| company | -0.1099502 |
| datum | -0.1467530 |
| ai | -0.4063409 |
Dimension 4 is positively associated with words like “robot”, “rule”, “bee”, “seaweed”, and “coral”, and negatively associated with “machine”, “human”, “company”, “datum”, and “ai”.
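The selection above (and in the R code earlier) amounts to sorting a dimension's term loadings and keeping the n most positive and n most negative entries. A minimal Python sketch of that logic, with made-up loadings:

```python
# Hypothetical term loadings for one LSA dimension (illustrative values).
loadings = {"forest": 0.22, "carbon": 0.21, "climate": 0.18,
            "robot": -0.28, "ai": -0.16, "human": -0.08}

def top_bottom(loadings, n):
    """Return the n most positive and n most negative terms."""
    ranked = sorted(loadings.items(), key=lambda kv: kv[1], reverse=True)
    return ranked[:n], ranked[-n:]

top, bottom = top_bottom(loadings, 2)
# top holds the most positive terms, bottom the most negative ones.
```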
We also examine the relation between this LSA result and the text categories: we combine the LSA result with each document's category and plot every text in the two figures below.
TED.lsa2.source <- TED_full %>%
select(2) %>% cbind(as.data.frame(TED.lsa2$docs))
LSA_p3 <- ggplot(data=TED.lsa2.source,mapping = aes(
x=V2,
y=V3,
color=cate))+
geom_point()+
labs(x = "dimension2",
y = "dimension3",
title = "Distribution of texts in different category",
subtitle = "LSA(TF-IDF) dimension 2 and 3")+
scale_colour_discrete(
name="Category",
breaks=c("1","2","3"),
labels=c("AI","Climate change","Relationships"))+
theme(plot.title = element_text(size = 12))
LSA_p4 <- ggplot(data=TED.lsa2.source,mapping = aes(
x=V3,
y=V4,
color=cate))+
geom_point()+
labs(x = "dimension3",
y = "dimension4",
title = "Distribution of texts in different category",
subtitle = "LSA(TF-IDF) dimension 3 and 4")+
scale_colour_discrete(
name="Category",
breaks=c("1","2","3"),
labels=c("AI","Climate change","Relationships"))+
theme(plot.title = element_text(size = 12))
(LSA_p3+LSA_p4)+
plot_layout(guides = "collect") & theme(legend.position = 'bottom')
Left plot: the x-axis is dimension 2 and the y-axis is dimension 3. Most texts in the “Climate change” category are positively associated with dimension 2, most texts in the “Relationships” category are positively associated with dimension 3, and most texts in the “AI” category are negatively associated with both dimensions.
Right plot: the x-axis is dimension 3 and the y-axis is dimension 4. A significant portion of the texts in the “AI” category are associated with dimension 4, although the direction of this association varies. It is not immediately clear from this plot alone what drives this pattern.
We now turn to Latent Dirichlet Allocation (LDA). LDA is a generative Bayesian model for topic modeling; like LSA, it discovers topics in a collection of documents. For illustration, we again use 4 topics.
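Conceptually, LDA assumes each document is generated by first drawing a topic from the document's topic mixture and then drawing each word from that topic's word distribution. A toy Python sketch of this generative process (the topic mixture and word distributions here are invented for illustration):

```python
import random

random.seed(123)

# Hypothetical topic-word distributions (beta) and one document's topic mixture (theta).
beta = {
    "climate": {"climate": 0.5, "energy": 0.3, "robot": 0.2},
    "ai":      {"robot": 0.5, "ai": 0.4, "climate": 0.1},
}
theta = {"climate": 0.7, "ai": 0.3}

def generate_doc(theta, beta, n_words):
    """For each word slot: draw a topic from theta, then a word from that topic."""
    doc = []
    for _ in range(n_words):
        topic = random.choices(list(theta), weights=list(theta.values()))[0]
        words = beta[topic]
        doc.append(random.choices(list(words), weights=list(words.values()))[0])
    return doc

doc = generate_doc(theta, beta, 10)
```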
TED.LDA <- LDA(
convert(TED.dfm, to = "topicmodels"),
k = 4,
control = list(seed = 123))
First, we examined the top 5 words in each topic. For example, the top 5 terms for topic 1 are “climate”, “year”, “make”, “change”, and “energy”.
kable(topicmodels::terms(TED.LDA, 5),
caption = "Top5 terms for each topic")%>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "200px")
| Topic 1 | Topic 2 | Topic 3 | Topic 4 |
|---|---|---|---|
| climate | people | people | robot |
| year | human | love | thing |
| make | ai | feel | time |
| change | make | life | make |
| energy | thing | thing | brain |
Then, we created a table showing the number of documents assigned to each topic. For example, topic 3 has the highest number of documents (439).
topicmodels::topics(TED.LDA)%>%
table() %>%
kable(caption = "Number of documents for each topic",
col.names = c("Topic", "number of documents")) %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "200px")
| Topic | number of documents |
|---|---|
| 1 | 316 |
| 2 | 395 |
| 3 | 439 |
| 4 | 321 |
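These counts come from assigning each document to its single most probable topic, i.e., an argmax over the rows of the document-topic (gamma) matrix. A minimal Python sketch with a hypothetical gamma matrix:

```python
from collections import Counter

# Hypothetical document-topic probabilities (gamma), one row per document.
gamma = {
    "text1": {1: 0.70, 2: 0.10, 3: 0.15, 4: 0.05},
    "text2": {1: 0.05, 2: 0.20, 3: 0.60, 4: 0.15},
    "text3": {1: 0.10, 2: 0.15, 3: 0.55, 4: 0.20},
}

# Assign each document to its most probable topic, then count per topic.
assignments = {doc: max(probs, key=probs.get) for doc, probs in gamma.items()}
topic_counts = Counter(assignments.values())
```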
Then we applied the topic_diagnostics function to diagnose the prominence, coherence, and exclusivity of each topic.
td <- topic_diagnostics(
topic_model = TED.LDA,
dtm_data = convert(TED.dfm, to = "topicmodels"))
kable(td,
caption = "Topic diagnostics")%>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "200px")
| topic_num | topic_size | mean_token_length | dist_from_corpus | tf_df_dist | doc_prominence | topic_coherence | topic_exclusivity |
|---|---|---|---|---|---|---|---|
| 1 | 3383.187 | 5.6 | 0.4216740 | 19.70057 | 418 | -86.57526 | 8.904578 |
| 2 | 3644.634 | 5.1 | 0.3793068 | 22.85747 | 541 | -73.97054 | 8.708857 |
| 3 | 4124.771 | 5.2 | 0.3999282 | 21.41908 | 574 | -77.63556 | 8.381961 |
| 4 | 3892.408 | 4.4 | 0.4113173 | 21.01260 | 435 | -67.67438 | 8.289089 |
Topic 3 has the highest prominence among the identified topics. Topic 4 has the highest coherence and Topic 1 the lowest, while Topic 1 has the highest exclusivity and Topic 4 the lowest. The identified topics thus vary in prominence, coherence, and exclusivity.
beta.long <- tidy(
TED.LDA,
matrix = "beta") # equivalent to melt (with this package)
beta.long %>%
group_by(topic) %>%
top_n(15, beta) %>%
ggplot(aes(reorder_within(term, beta, topic), beta)) +
geom_col(show.legend = FALSE) +
coord_flip()+
facet_wrap(~ topic, scales = "free_y") +
scale_x_reordered() +
xlab("Term") +
theme(
axis.text.y = element_text(size = 8),
axis.text.x = element_text(size = 8),
strip.text = element_text(size = 8))
Topic 1 focuses on terms like “climate”, “change”, “energy”, and “water”. Topic 2 focuses on terms like “people”, “ai”, “work”, and “technology”. Topic 3 focuses on terms like “love”, “life”, “woman”, and “relationship”. Topic 4 focuses on terms like “robot”, “thing”, “brain”, and “human”.
document <- rownames(TED.lsa.source)
TED.lsa.source <- cbind(document,TED.lsa.source)
gamma.long <- tidy(TED.LDA,matrix = "gamma") %>%
right_join(TED.lsa.source[1:2],by = "document")
gamma.long$cate<-factor(gamma.long$cate,
levels = c('1','2','3'),
labels = c("AI","Climate change","Relationships"))
gamma.long %>% ggplot(mapping = aes(x=document,y=gamma,fill=cate))+
geom_col()+
coord_flip() +
facet_wrap(~topic,ncol = 4)
The charts above show that the climate-change-related documents mainly concern Topic 1, the relationships-related documents mainly concern Topic 3, and the AI-related documents mainly concern Topics 2 and 4.
In addition to Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA), we also use word and document embeddings to analyze the transcripts of TED videos. Embedding refers to representing elements (e.g., documents or tokens) in a vector space: words are embedded first, and document embeddings are then constructed from them to capture the co-occurrence patterns of the words within each document.
The objective of word embedding is to learn a representation of words that reflects their co-occurrence patterns. To obtain these co-occurrence patterns, we use the fcm function from the quanteda package.
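The fcm call below counts, for every pair of words, how often they occur within a 5-token window of one another. A minimal Python sketch of that windowed counting (the token list is invented for illustration):

```python
from collections import Counter

def cooccurrence(tokens, window=5):
    """Count unordered word pairs appearing within `window` tokens of each other."""
    counts = Counter()
    for i, w in enumerate(tokens):
        # Pair token i with every token up to `window` positions ahead of it.
        for j in range(i + 1, min(i + 1 + window, len(tokens))):
            counts[frozenset((w, tokens[j]))] += 1
    return counts

tokens = ["artificial", "intelligence", "help", "doctor",
          "diagnose", "patient", "artificial", "intelligence"]
counts = cooccurrence(tokens, window=5)
```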
TED.coo <- fcm(TED.tk,
context = "window",
window = 5,
tri = FALSE)
TC <- head(TED.coo) %>%
convert(to="data.frame") %>%
select(1:6)
kable(TC,
caption = "co-occurrence matrix") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| doc_id | today | artificial | intelligence | diagnose | fly |
|---|---|---|---|---|---|
| today | 28 | 8 | 8 | 4 | 5 |
| artificial | 8 | 30 | 170 | 2 | 2 |
| intelligence | 8 | 170 | 78 | 1 | 0 |
| diagnose | 4 | 2 | 1 | 0 | 1 |
| fly | 5 | 2 | 0 | 1 | 8 |
| commercial | 0 | 0 | 0 | 1 | 2 |
The table above presents a sample of the resulting co-occurrence data. As can be seen, the co-occurrence between the words artificial and intelligence is relatively high (170), while the co-occurrence between the words fly and intelligence is considerably lower (0). These patterns are indicative of the relationships between the words in the corpus and will be useful in constructing the word embeddings.
set.seed(123)
p <- 2 # word embedding dimension
TED.glove <- GlobalVectors$new(rank = p,
x_max = 10) # x_max is a needed technical option
TED.we <- TED.glove$fit_transform(TED.coo) # central vectors; TED.glove$components contains the context vectors
## INFO [23:48:43.736] epoch 1, loss 0.0446
## INFO [23:48:44.294] epoch 2, loss 0.0368
## INFO [23:48:44.597] epoch 3, loss 0.0346
## INFO [23:48:44.854] epoch 4, loss 0.0335
## INFO [23:48:45.076] epoch 5, loss 0.0330
## INFO [23:48:45.312] epoch 6, loss 0.0325
## INFO [23:48:45.634] epoch 7, loss 0.0322
## INFO [23:48:45.918] epoch 8, loss 0.0320
## INFO [23:48:46.119] epoch 9, loss 0.0318
## INFO [23:48:46.329] epoch 10, loss 0.0316
TED.we <- t(TED.glove$components) + TED.we # unique representation: sum of central and context vectors
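GloVe learns two vectors per word (a central and a context vector), and the line above sums them into one representation per word. Closeness in the resulting space is typically measured with cosine similarity; a minimal Python sketch with made-up 2-dimensional vectors:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical 2-dimensional word embeddings.
emb = {
    "robot":    [1.0, 0.9],
    "computer": [0.9, 1.0],
    "forest":   [-1.0, 0.2],
}

# "robot" should be closer to "computer" than to "forest".
sim_rc = cosine(emb["robot"], emb["computer"])
sim_rf = cosine(emb["robot"], emb["forest"])
```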
To visualize the learned word embeddings, we create two plots: the first depicts the vectors of the 100 most frequently used words, while the second shows all of the words but labels only a subset of them.
index <- textstat_frequency(dfm(TED.tk))[1:100, ]$feature
## words with the 100 largest frequencies
data.for.plot <- data.frame(TED.we[index, ])
data.for.plot$word <- row.names(data.for.plot)
Emb_p1 <- ggplot(data.for.plot,
aes(x = X1,
y = X2,
label = word)) +
geom_text_repel(max.overlaps = 100)+
theme_void() +
labs(title="map of top100 words")
TED.we.df <- as.data.frame(TED.we)
word <- rownames(TED.we.df)
TED.we.df <- cbind(word,TED.we.df)
e <- c(1:15045)
row.names(TED.we.df) <- e
Emb_p2 <- ggplot(TED.we.df,aes(x=V1,y=V2))+
geom_text_repel(data = subset(TED.we.df, V1 <=-1.8|V2>3|V1>2),
mapping = aes(label = word),
hjust = "inward",
max.overlaps = 100) +
geom_point(color="grey")+
labs(title="map of all words(partially labeled)")
Emb_p1
Emb_p2
To avoid label overlap between data points in the plots, we will use the geom_text_repel function. Some labels will also be accompanied by a black line, indicating the location of the corresponding data point.
The first plot depicts the relationships between frequently used words. It can be seen that words that are close in the embedding space are often used together. For example, the words “robot” and “computer” are close, indicating that they are frequently used together. Similarly, the words “man” and “woman” are close, suggesting that they are also commonly used together.
The second plot presents the distribution of all used words, with a subset labeled for illustration. This plot shows that words such as “carbon” and “emission” are close in the embedding space, indicating that they are often used together.
We now build the document embedding by computing the centroids of the documents.
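A centroid document embedding is simply the element-wise average of the embeddings of the tokens appearing in the document. A minimal Python sketch (the word vectors here are hypothetical):

```python
# Hypothetical 2-dimensional word embeddings.
emb = {
    "artificial":   [1.17, 1.10],
    "intelligence": [1.54, 1.41],
    "doctor":       [-0.27, 0.84],
}

def doc_embedding(tokens, emb):
    """Average the word vectors of the document's tokens (its centroid)."""
    vecs = [emb[t] for t in tokens if t in emb]
    n = len(vecs)
    return [sum(v[d] for v in vecs) / n for d in range(len(vecs[0]))]

centroid = doc_embedding(["artificial", "intelligence", "doctor"], emb)
```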
kable(TED.we[TED.tk[[1]], ],
col.names = c("dimension1","dimension2"),
caption = "word vectors") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| dimension1 | dimension2 | |
|---|---|---|
| today | 0.0146311 | 2.4838196 |
| artificial | 1.1660034 | 1.0971134 |
| intelligence | 1.5393247 | 1.4062923 |
| help | 0.3788145 | 1.2607835 |
| doctor | -0.2729373 | 0.8436303 |
| diagnose | -0.6085475 | 0.2545526 |
| patient | -0.4756314 | 0.7942109 |
| pilot | 0.3057945 | -0.4767435 |
| fly | 0.8232728 | 0.1019763 |
| commercial | 0.1456382 | -0.3353500 |
| aircraft | 0.3484717 | -0.8078333 |
| city | -0.1908626 | 1.6693269 |
| planner | -0.7025986 | -0.5691790 |
| predict | 0.1421813 | 1.0358475 |
| traffic | -1.0712650 | 0.3900522 |
| matter | 0.9219089 | 1.1550515 |
| ais | 0.1669588 | 0.0249644 |
| computer | 1.4561314 | 1.9093218 |
| scientist | 0.5762930 | 1.0029609 |
| design | 0.7048156 | 1.4889276 |
| artificial | 1.1660034 | 1.0971134 |
| intelligence | 1.5393247 | 1.4062923 |
| self-taught | -0.2932413 | -0.1437810 |
| work | 1.5092763 | 3.4837919 |
| simple | 0.6056631 | 1.2115127 |
| set | 0.5339148 | 1.3200916 |
| instruction | -0.3186817 | -0.1288832 |
| create | 1.2345958 | 2.1137113 |
| unique | -0.3828762 | 0.6437062 |
| array | 0.1231429 | -0.3552229 |
| rule | 0.1636074 | 0.9138203 |
| strategy | -0.6884873 | 0.4753147 |
| machine | 0.9781415 | 1.9614004 |
| learn | 1.7738640 | 2.2552487 |
| way | 0.4804493 | 1.7365515 |
| build | 1.0606711 | 2.4207192 |
| self-teaching | -0.1071285 | -0.2319459 |
| program | 0.7276333 | 0.8668290 |
| rely | 0.0242976 | -0.0960297 |
| basic | -0.2115567 | 0.6799598 |
| type | 0.7962711 | 0.7498920 |
| machine | 0.9781415 | 1.9614004 |
| learn | 1.7738640 | 2.2552487 |
| unsupervised | -0.0766919 | -0.1160724 |
| learn | 1.7738640 | 2.2552487 |
| supervise | 0.4242013 | -0.0635206 |
| learn | 1.7738640 | 2.2552487 |
| reinforcement | 0.6292188 | -0.3726759 |
| learn | 1.7738640 | 2.2552487 |
| action | 0.4963097 | 1.1906332 |
| imagine | 1.0990429 | 1.5820785 |
| researcher | 0.4231239 | 0.6202900 |
| pull | 0.0598999 | 0.5287256 |
| information | 0.5589172 | 1.0472058 |
| set | 0.5339148 | 1.3200916 |
| medical | 0.0258312 | 0.4652235 |
| datum | 0.6112607 | 2.1383116 |
| thousand | -0.4950527 | 1.6905559 |
| patient | -0.4756314 | 0.7942109 |
| profile | -0.1634848 | -0.1582078 |
| unsupervised | -0.0766919 | -0.1160724 |
| learn | 1.7738640 | 2.2552487 |
| approach | 0.4506895 | 0.6724980 |
| ideal | 0.7867587 | -0.0705095 |
| analyze | -0.3922811 | 0.3762168 |
| profile | -0.1634848 | -0.1582078 |
| find | 1.3106410 | 2.6129258 |
| general | 0.8329176 | 0.4257263 |
| similarity | 0.2177481 | -0.3003376 |
| pattern | 0.4733686 | 0.7064781 |
| patient | -0.4756314 | 0.7942109 |
| similar | 0.5815326 | 0.3094706 |
| disease | -1.3919198 | 0.8973716 |
| presentation | 0.0161713 | -0.7183924 |
| treatment | -0.7518382 | 0.0735639 |
| produce | 0.2735283 | 0.7296061 |
| specific | 0.1971707 | 0.4065676 |
| set | 0.5339148 | 1.3200916 |
| side | 0.2506871 | 1.1470893 |
| effect | -0.9940202 | 1.1064776 |
| broad | -0.2156932 | 0.1397078 |
| pattern-seeking | 0.0606716 | 0.3280154 |
| approach | 0.4506895 | 0.6724980 |
| identify | 0.3988025 | 0.5737686 |
| similarity | 0.2177481 | -0.3003376 |
| patient | -0.4756314 | 0.7942109 |
| profile | -0.1634848 | -0.1582078 |
| find | 1.3106410 | 2.6129258 |
| emerge | 0.7936948 | -0.1188745 |
| pattern | 0.4733686 | 0.7064781 |
| human | 0.7873912 | 3.4115228 |
| guidance | 0.1456853 | -0.7737505 |
| imagine | 1.0990429 | 1.5820785 |
| doctor | -0.2729373 | 0.8436303 |
| specific | 0.1971707 | 0.4065676 |
| physician | -0.1596647 | -0.7753850 |
| create | 1.2345958 | 2.1137113 |
| algorithm | 1.1644138 | 1.0141295 |
| diagnose | -0.6085475 | 0.2545526 |
| condition | -0.0100761 | 0.6303259 |
| begin | 0.8284561 | 1.5427773 |
| collect | 0.4072523 | 0.4034116 |
| set | 0.5339148 | 1.3200916 |
| datum | 0.6112607 | 2.1383116 |
| medical | 0.0258312 | 0.4652235 |
| image | 1.0811438 | 0.7398235 |
| test | 0.9260422 | 0.4222638 |
| result | 0.1441240 | 1.0492582 |
| healthy | 0.2162610 | 0.6087022 |
| patient | -0.4756314 | 0.7942109 |
| diagnose | -0.6085475 | 0.2545526 |
| condition | -0.0100761 | 0.6303259 |
| input | 0.2005409 | -0.1673951 |
| datum | 0.6112607 | 2.1383116 |
| program | 0.7276333 | 0.8668290 |
| design | 0.7048156 | 1.4889276 |
| identify | 0.3988025 | 0.5737686 |
| feature | 0.0448011 | -0.2396647 |
| share | 0.3650489 | 1.7650883 |
| sick | -0.1395801 | 0.2124977 |
| patient | -0.4756314 | 0.7942109 |
| healthy | 0.2162610 | 0.6087022 |
| patient | -0.4756314 | 0.7942109 |
| base | 0.6835954 | 0.9794220 |
| frequently | 0.4453416 | -0.7950179 |
| see | 0.2641034 | 0.1415384 |
| feature | 0.0448011 | -0.2396647 |
| program | 0.7276333 | 0.8668290 |
| assign | -0.3634961 | -0.3118342 |
| value | -0.3789236 | 0.9080443 |
| feature | 0.0448011 | -0.2396647 |
| diagnostic | 0.2218103 | -0.4276109 |
| significance | -0.1136311 | -0.1392494 |
| generate | 1.1475955 | 0.3451212 |
| algorithm | 1.1644138 | 1.0141295 |
| diagnose | -0.6085475 | 0.2545526 |
| future | 0.9745135 | 2.3862891 |
| patient | -0.4756314 | 0.7942109 |
| unlike | 0.4745628 | -0.4171598 |
| unsupervised | -0.0766919 | -0.1160724 |
| learn | 1.7738640 | 2.2552487 |
| doctor | -0.2729373 | 0.8436303 |
| computer | 1.4561314 | 1.9093218 |
| scientist | 0.5762930 | 1.0029609 |
| active | 0.1678649 | 0.0331035 |
| role | 0.2140487 | 0.6690150 |
| doctor | -0.2729373 | 0.8436303 |
| make | 1.3236411 | 4.0568007 |
| final | 0.1894689 | 0.1876892 |
| diagnosis | -0.2745019 | -0.1235628 |
| check | 0.2385149 | 0.5567917 |
| accuracy | -0.4688711 | 0.4816281 |
| algorithm’s | 0.4560409 | -0.3834856 |
| prediction | 0.9543240 | 0.2094144 |
| computer | 1.4561314 | 1.9093218 |
| scientist | 0.5762930 | 1.0029609 |
| update | 0.2973457 | -0.5726926 |
| dataset | -0.4789954 | -0.4238427 |
| adjust | -0.1226841 | -0.3823207 |
| program’s | -0.2202482 | -0.2873325 |
| parameter | -0.2576844 | -0.1770638 |
| improve | 0.9276726 | 0.6451846 |
| accuracy | -0.4688711 | 0.4816281 |
| hands-on | -0.0284745 | -0.4517132 |
| approach | 0.4506895 | 0.6724980 |
| call | 1.5892779 | 2.2694519 |
| supervise | 0.4242013 | -0.0635206 |
| learn | 1.7738640 | 2.2552487 |
| doctor | -0.2729373 | 0.8436303 |
| design | 0.7048156 | 1.4889276 |
| algorithm | 1.1644138 | 1.0141295 |
| recommend | 0.5713379 | -0.3215806 |
| treatment | -0.7518382 | 0.0735639 |
| plan | 0.2369646 | 0.9063811 |
| plan | 0.2369646 | 0.9063811 |
| implement | -0.0792238 | 0.0830680 |
| stage | 0.4365981 | 0.5842383 |
| change | 0.4560638 | 2.8613491 |
| depend | 0.0480523 | 0.4532660 |
| individual’s | -0.6186286 | -0.2865487 |
| response | 0.6711636 | 0.4539025 |
| treatment | -0.7518382 | 0.0735639 |
| doctor | -0.2729373 | 0.8436303 |
| decide | 0.5469986 | 1.3677213 |
| reinforcement | 0.6292188 | -0.3726759 |
| learn | 1.7738640 | 2.2552487 |
| program | 0.7276333 | 0.8668290 |
| iterative | 0.2897234 | -0.4222767 |
| approach | 0.4506895 | 0.6724980 |
| gather | 0.3756253 | -0.0332146 |
| feedback | -0.1647712 | 0.1707671 |
| medication | -0.1391621 | -0.6477464 |
| dosage | -0.0785653 | -0.5983784 |
| treatment | -0.7518382 | 0.0735639 |
| effective | -0.4747538 | 0.8294426 |
| compare | 0.0914128 | 0.2992472 |
| datum | 0.6112607 | 2.1383116 |
| patient’s | -0.2402303 | -0.6235391 |
| profile | -0.1634848 | -0.1582078 |
| create | 1.2345958 | 2.1137113 |
| unique | -0.3828762 | 0.6437062 |
| optimal | 0.1127345 | -0.6979981 |
| treatment | -0.7518382 | 0.0735639 |
| plan | 0.2369646 | 0.9063811 |
nd <- length(TED.tk)                   # number of documents
TED.de <- matrix(nrow = nd, ncol = p)  # document embedding matrix (1 document per row)
for (i in 1:nd) {
  words_in_i <- TED.we[TED.tk[[i]], , drop = FALSE]
  # drop = FALSE is needed in case a document contains only one token
  TED.de[i, ] <- apply(words_in_i, 2, mean)  # mean-pool the word vectors
}
row.names(TED.de) <- names(TED.tk)
kable(TED.de,
col.names = c("dimension1","dimension2"),
caption = "document vectors") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
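Once each document is reduced to a single vector, similarity between talks can be measured directly on these embeddings, for example with cosine similarity. A minimal self-contained sketch (the `cosine_sim` helper and the toy 2-dimensional vectors standing in for rows of `TED.de` are illustrative, not part of the analysis above):

```r
# Cosine similarity between two vectors: cos(a, b) = (a . b) / (|a| |b|)
cosine_sim <- function(a, b) {
  sum(a * b) / (sqrt(sum(a^2)) * sqrt(sum(b^2)))
}

# Toy stand-ins for two rows of the document embedding matrix TED.de
doc1 <- c(0.32, 0.72)
doc2 <- c(0.52, 0.88)

cosine_sim(doc1, doc2)  # close to 1: the two toy documents point in similar directions
```

With the real matrix, `cosine_sim(TED.de[i, ], TED.de[j, ])` would compare talks `i` and `j`; values near 1 indicate documents whose average word vectors point in similar directions.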
| | dimension1 | dimension2 |
|---|---|---|
| text1 | 0.3206617 | 0.7237736 |
| text2 | 0.5168198 | 0.8834374 |
| text3 | 0.4637863 | 1.0714242 |
| text4 | 0.4544400 | 1.1999466 |
| text5 | 0.4017896 | 1.0393213 |
| text6 | 0.5689656 | 1.2473630 |
| text7 | 0.4954720 | 1.1054186 |
| text8 | 0.5892759 | 0.9642223 |
| text9 | 0.7444610 | 1.1674419 |
| text10 | 0.9134851 | 1.0555264 |
| text11 | 0.4840624 | 0.9365214 |
| text12 | 0.6928026 | 1.2934775 |
| text13 | 0.5043109 | 1.2950328 |
| text14 | 0.6584153 | 1.5214003 |
| text15 | 0.6272884 | 1.9065189 |
| text16 | 0.4838971 | 1.1489542 |
| text17 | 0.5824606 | 1.3184144 |
| text18 | 0.5203449 | 1.2690816 |
| text19 | 0.4798303 | 1.2869815 |
| text20 | 0.6433493 | 1.2282047 |
| text21 | 0.3754559 | 1.1195650 |
| text22 | 0.4803191 | 0.9238710 |
| text23 | 0.3198699 | 1.1785389 |
| text24 | 0.5212600 | 1.1148524 |
| text25 | 0.5222708 | 1.2437906 |
| text26 | 0.4637791 | 1.3210792 |
| text27 | 0.4369534 | 1.5134178 |
| text28 | 0.2976676 | 0.8315419 |
| text29 | 0.5619417 | 0.9849799 |
| text30 | 0.5243376 | 1.0631070 |
| text31 | 0.4950939 | 0.9510074 |
| text32 | 0.5555324 | 1.6334259 |
| text33 | 0.3068343 | 1.5200304 |
| text34 | 0.4774039 | 1.1378388 |
| text35 | 0.4215191 | 1.1132579 |
| text36 | 0.6036234 | 1.3010224 |
| text37 | 0.4034977 | 1.1765178 |
| text38 | 0.4696785 | 0.6848206 |
| text39 | 0.4548243 | 1.0290053 |
| text40 | 0.4784990 | 1.0066356 |
| text41 | 0.5857135 | 1.0320285 |
| text42 | 0.3311251 | 0.5148928 |
| text43 | 0.4894920 | 0.7873627 |
| text44 | 0.3200625 | 0.6254700 |
| text45 | 0.3958189 | 1.0876466 |
| text46 | 0.3421299 | 0.6349629 |
| text47 | 0.4718678 | 1.0061852 |
| text48 | 0.2342618 | 0.7707632 |
| text49 | 0.5363185 | 1.1644034 |
| text50 | 0.5795513 | 1.2368919 |
| text51 | 0.2243335 | 0.7091524 |
| text52 | 0.1127654 | 0.7857429 |
| text53 | 0.4082809 | 0.7978436 |
| text54 | 0.3483846 | 0.9244197 |
| text55 | 0.3643745 | 0.7139187 |
| text56 | 0.3892213 | 1.0528155 |
| text57 | 0.4034803 | 0.0928330 |
| text58 | 0.4078615 | 0.8066272 |
| text59 | 0.5903781 | 0.8954662 |
| text60 | 0.5488916 | 1.2879928 |
| text61 | 0.2619769 | 0.7158324 |
| text62 | 0.4300415 | 1.2665044 |
| text63 | 0.3006066 | 1.1733468 |
| text64 | 0.5148443 | 0.9284579 |
| text65 | 0.3110855 | 0.8847926 |
| text66 | 0.5226286 | 1.1707530 |
| text67 | 0.5523866 | 1.2136171 |
| text68 | 0.6397303 | 1.2467844 |
| text69 | 0.3943587 | 1.3606282 |
| text70 | 0.3877844 | 0.8055777 |
| text71 | 0.3077556 | 0.9750718 |
| text72 | 0.5158158 | 0.9314795 |
| text73 | 0.3230706 | 0.8906216 |
| text74 | 0.2818193 | 0.7743422 |
| text75 | 0.3846178 | 0.8479019 |
| text76 | 0.3714443 | 1.1474931 |
| text77 | 0.4481493 | 1.1846998 |
| text78 | 0.3566164 | 0.8591778 |
| text79 | 0.1270389 | 1.0520492 |
| text80 | 0.1521209 | 1.1354050 |
| text81 | 0.2298767 | 1.0790546 |
| text82 | 0.3116077 | 1.0006355 |
| text83 | 0.4602098 | 1.2351271 |
| text84 | 0.3734525 | 0.9809176 |
| text85 | 0.6043153 | 1.0504681 |
| text86 | 0.4889955 | 1.0172434 |
| text87 | 0.4167489 | 0.8212637 |
| text88 | 0.9485383 | 1.3098636 |
| text89 | 0.4023682 | 0.4658491 |
| text90 | 0.2624928 | 0.7790413 |
| text91 | 0.4520944 | 1.0202273 |
| text92 | 0.3813778 | 0.8997004 |
| text93 | 0.0919997 | 0.5361382 |
| text94 | 0.1552662 | 0.5919150 |
| text95 | 0.2767750 | 0.8275219 |
| text96 | 0.6044990 | 0.8167614 |
| text97 | 0.6686795 | 0.9698679 |
| text98 | 0.5352579 | 0.7939442 |
| text99 | 0.4265147 | 1.1560104 |
| text100 | 0.6481380 | 1.3678651 |
| text101 | 0.7108251 | 1.3877404 |
| text102 | 0.6371138 | 1.1079373 |
| text103 | 0.5587647 | 1.0302039 |
| text104 | 0.3878218 | 0.9991143 |
| text105 | 0.2940748 | 1.2156087 |
| text106 | 0.3567563 | 1.0066551 |
| text107 | 0.2125711 | 0.8670340 |
| text108 | 0.2788008 | 0.7771835 |
| text109 | 0.3484636 | 0.9809036 |
| text110 | 0.4081205 | 1.0180798 |
| text111 | 0.3562450 | 1.0323871 |
| text112 | 0.3010308 | 0.9542504 |
| text113 | 0.6694914 | 1.0995979 |
| text114 | 0.6044472 | 1.4135989 |
| text115 | 0.0877663 | 0.8512282 |
| text116 | -0.0402707 | 0.7489687 |
| text117 | 0.4156652 | 0.3264468 |
| text118 | 0.4801959 | 0.9715000 |
| text119 | 0.4321792 | 0.7451329 |
| text120 | 0.4485178 | 0.6282799 |
| text121 | 0.4530286 | 0.7004461 |
| text122 | 0.6686352 | 1.5843726 |
| text123 | 0.4147788 | 0.8839076 |
| text124 | 0.3945454 | 0.7278247 |
| text125 | 0.3615401 | 0.7085455 |
| text126 | 0.8196714 | 1.3590607 |
| text127 | 0.1619974 | 0.4987173 |
| text128 | -0.0370267 | 0.2157868 |
| text129 | 0.3802833 | 0.8947542 |
| text130 | 0.3644942 | 0.5787015 |
| text131 | 0.3009267 | 0.6003012 |
| text132 | 0.4743491 | 1.1244929 |
| text133 | 0.4385638 | 1.1535105 |
| text134 | 0.4156158 | 1.0561881 |
| text135 | 0.3547919 | 0.9661529 |
| text136 | 0.3381186 | 1.1974565 |
| text137 | 0.0331316 | 0.6962642 |
| text138 | 0.2601172 | 0.8865109 |
| text139 | 0.3534057 | 0.8488921 |
| text140 | 0.5411832 | 0.7372633 |
| text141 | 0.3729944 | 0.4601038 |
| text142 | 0.1955495 | 0.6333327 |
| text143 | 0.3585554 | 0.8393556 |
| text144 | 0.5691128 | 1.0787401 |
| text145 | 0.2135175 | 0.6151303 |
| text146 | 0.3522584 | 0.7303662 |
| text147 | 0.1362442 | 0.4243938 |
| text148 | 0.4975846 | 1.0726621 |
| text149 | 0.4282436 | 0.9405778 |
| text150 | 0.4559913 | 0.8309308 |
| text151 | 0.4611397 | 1.0313080 |
| text152 | 0.4943939 | 1.0703429 |
| text153 | 0.5063099 | 1.0772047 |
| text154 | -0.0673418 | 0.9368247 |
| text155 | 0.5506096 | 1.0945753 |
| text156 | 0.6738236 | 1.1453593 |
| text157 | 0.5148479 | 1.2023714 |
| text158 | -0.1067355 | 0.6582978 |
| text159 | 0.3193785 | 0.9671215 |
| text160 | 0.2727967 | 0.7498893 |
| text161 | 0.4600384 | 0.7210538 |
| text162 | 0.3986739 | 0.4417602 |
| text163 | 0.2791975 | 0.6306868 |
| text164 | 0.4379215 | 0.7482065 |
| text165 | 0.3749722 | 0.6507812 |
| text166 | 0.4291544 | 0.6912629 |
| text167 | 0.5885770 | 0.8425847 |
| text168 | 0.5181978 | 0.3004282 |
| text169 | 0.0966545 | 1.1225628 |
| text170 | -0.0372683 | 0.9759113 |
| text171 | 0.4103355 | 0.9379720 |
| text172 | 0.3998991 | 1.2209025 |
| text173 | 0.5243984 | 1.0785700 |
| text174 | 0.5743786 | 1.0568647 |
| text175 | 0.3257795 | 0.6942220 |
| text176 | 0.3218647 | 0.6211355 |
| text177 | 0.4610017 | 0.8875018 |
| text178 | 0.6889232 | 1.4824850 |
| text179 | 0.5276713 | 1.0582643 |
| text180 | 0.3460609 | 0.7122861 |
| text181 | 0.2607080 | 0.7931745 |
| text182 | 0.4814475 | 1.2782283 |
| text183 | 0.3369461 | 0.7118959 |
| text184 | 0.3540776 | 0.8371812 |
| text185 | 0.3487186 | 0.9645584 |
| text186 | 0.4081300 | 0.9069947 |
| text187 | 1.0087831 | 2.0810934 |
| text188 | 0.5101811 | 0.7871550 |
| text189 | 0.4103001 | 0.7653051 |
| text190 | 0.5815079 | 0.9709981 |
| text191 | 0.2524182 | 0.6310216 |
| text192 | 0.3519514 | 1.1128879 |
| text193 | 0.4222532 | 0.9183228 |
| text194 | 0.3694925 | 0.9846042 |
| text195 | 0.3015085 | 0.8247329 |
| text196 | 0.3172701 | 1.3160555 |
| text197 | 0.2634322 | 0.9562766 |
| text198 | 0.3524151 | 1.1250820 |
| text199 | 0.4507387 | 0.9652661 |
| text200 | 0.4358790 | 0.9467873 |
| text201 | 0.3860008 | 0.9364039 |
| text202 | 0.2109027 | 0.7955251 |
| text203 | 0.4744664 | 0.9417117 |
| text204 | 0.6416532 | 1.0219148 |
| text205 | 0.3468441 | 0.9228163 |
| text206 | 0.4984358 | 1.0447047 |
| text207 | 0.4769551 | 0.9649770 |
| text208 | 0.5748775 | 1.2606526 |
| text209 | 0.2371040 | 0.9528369 |
| text210 | 0.0905180 | 0.6934480 |
| text211 | 0.3098736 | 0.8572256 |
| text212 | 0.4134373 | 0.8350207 |
| text213 | 0.7066240 | 1.0806379 |
| text214 | 0.3913807 | 0.7533351 |
| text215 | 0.6158694 | 0.8897416 |
| text216 | 0.5676432 | 0.8182434 |
| text217 | 0.2521956 | 0.8231563 |
| text218 | 0.5675986 | 0.9157264 |
| text219 | 0.3746782 | 0.7134432 |
| text220 | 0.4051912 | 0.8739600 |
| text221 | 0.6533390 | 1.6319692 |
| text222 | 0.7512298 | 1.2493351 |
| text223 | 0.8525477 | 1.3324730 |
| text224 | 0.7568779 | 0.8720537 |
| text225 | 0.7333201 | 1.2229223 |
| text226 | 0.5728668 | 0.9469719 |
| text227 | 0.4630637 | 1.1570871 |
| text228 | 0.5361757 | 1.2108375 |
| text229 | 0.6658592 | 1.4532268 |
| text230 | 0.5234289 | 1.6235470 |
| text231 | 0.8065061 | 1.5481845 |
| text232 | 0.4754435 | 1.2318593 |
| text233 | 0.4339115 | 0.9017728 |
| text234 | 0.6250199 | 1.3016994 |
| text235 | 0.5695654 | 1.4362820 |
| text236 | 0.3892338 | 1.1564358 |
| text237 | 0.5085825 | 1.0808077 |
| text238 | 0.4541476 | 0.8268188 |
| text239 | 0.5364824 | 1.2984320 |
| text240 | 0.3596671 | 0.8552751 |
| text241 | 0.3499699 | 0.7835426 |
| text242 | 0.2994613 | 0.6012807 |
| text243 | 0.2728187 | 0.8291944 |
| text244 | 0.4045749 | 0.8779587 |
| text245 | 0.5889522 | 1.1442168 |
| text246 | 0.4112286 | 0.8936758 |
| text247 | 0.4358270 | 0.8482789 |
| text248 | 0.3760626 | 0.8577306 |
| text249 | 0.4322426 | 0.7384925 |
| text250 | 0.3620623 | 0.8926993 |
| text251 | 0.4441125 | 1.1085689 |
| text252 | 0.4235251 | 0.5425342 |
| text253 | 0.4162005 | 0.6710929 |
| text254 | 0.3402889 | 0.5154874 |
| text255 | 0.4296784 | 0.6265803 |
| text256 | 0.5593536 | 0.5516163 |
| text257 | 0.5877996 | 1.0068498 |
| text258 | 0.5865649 | 0.6681438 |
| text259 | 0.5515262 | 0.9164245 |
| text260 | 0.5404999 | 0.6473520 |
| text261 | 0.5641910 | 0.5906715 |
| text262 | 0.3664958 | 0.6933390 |
| text263 | 0.5750177 | 0.8976974 |
| text264 | 0.4417726 | 0.7633179 |
| text265 | 0.5779450 | 0.9020401 |
| text266 | 0.4173044 | 0.8054296 |
| text267 | 0.6458272 | 1.1239346 |
| text268 | 0.5187426 | 1.0653581 |
| text269 | 0.1859515 | 0.6668625 |
| text270 | 0.5231189 | 1.2160222 |
| text271 | 0.5822334 | 1.1806224 |
| text272 | 0.3775930 | 0.7541758 |
| text273 | 0.3994397 | 1.1371449 |
| text274 | 0.2224976 | 0.6551238 |
| text275 | 0.3061505 | 0.9628953 |
| text276 | 0.5888407 | 1.3623078 |
| text277 | 0.6706179 | 0.9223544 |
| text278 | 0.5793736 | 0.9047169 |
| text279 | 0.5131950 | 0.9682427 |
| text280 | 0.7143250 | 1.1057903 |
| text281 | 0.3583595 | 1.1583379 |
| text282 | 0.5014974 | 1.1530740 |
| text283 | 0.4869178 | 1.4727163 |
| text284 | 0.2302536 | 0.4458393 |
| text285 | 0.4075699 | 0.9003071 |
| text286 | 0.4672961 | 0.8592696 |
| text287 | 0.2884198 | 0.7537649 |
| text288 | 0.1487489 | 0.7115874 |
| text289 | 0.3860820 | 0.8660427 |
| text290 | 0.0947782 | 0.9705376 |
| text291 | 0.3603031 | 0.9483341 |
| text292 | 0.3195565 | 0.8272410 |
| text293 | 0.3161433 | 0.8137971 |
| text294 | 0.3106450 | 0.8380243 |
| text295 | 0.6092089 | 1.2607875 |
| text296 | 0.3434946 | 1.0081080 |
| text297 | 0.1489391 | 0.8250209 |
| text298 | 0.1652102 | 0.7243024 |
| text299 | 0.4783177 | 0.9326762 |
| text300 | 0.4507939 | 1.0134128 |
| text301 | 0.4839123 | 0.8827206 |
| text302 | 0.3538614 | 0.8023059 |
| text303 | 0.1321880 | 0.6432053 |
| text304 | 0.3628639 | 1.1582999 |
| text305 | 0.5456020 | 1.0000607 |
| text306 | 0.5911184 | 1.1322215 |
| text307 | 0.5364341 | 0.9237803 |
| text308 | 0.4529509 | 1.0790554 |
| text309 | 0.6369687 | 1.1589961 |
| text310 | 0.6096742 | 1.1121716 |
| text311 | 0.5852394 | 1.3409308 |
| text312 | 0.5174692 | 1.0046196 |
| text313 | 0.5758890 | 1.1374553 |
| text314 | 0.6381163 | 1.0766066 |
| text315 | 0.7385720 | 1.3139897 |
| text316 | 0.5486983 | 1.1894088 |
| text317 | 0.6533801 | 1.1212686 |
| text318 | 0.6787738 | 1.4063482 |
| text319 | 0.7531051 | 1.3590422 |
| text320 | 0.5421495 | 1.2417081 |
| text321 | 0.2997608 | 0.8806871 |
| text322 | 0.3785156 | 0.5894474 |
| text323 | 0.6126860 | 1.1152720 |
| text324 | 0.7108398 | 1.3301172 |
| text325 | 0.4290399 | 1.0490301 |
| text326 | 0.3007878 | 0.4384491 |
| text327 | 0.9013264 | 1.1753853 |
| text328 | 0.6540025 | 1.2779625 |
| text329 | 0.4894566 | 1.2498517 |
| text330 | 0.6779919 | 1.5056634 |
| text331 | 0.5431027 | 0.8730582 |
| text332 | 0.3632790 | 0.7836764 |
| text333 | 0.6575295 | 1.2916840 |
| text334 | 0.4350176 | 0.9229447 |
| text335 | -0.1686731 | 0.5566461 |
| text336 | 0.3828682 | 0.8388799 |
| text337 | 0.6598375 | 1.2501046 |
| text338 | 0.4364072 | 0.8248634 |
| text339 | 0.6675804 | 1.1242619 |
| text340 | 0.5178689 | 0.9389688 |
| text341 | 0.5722776 | 1.1527179 |
| text342 | 0.4477138 | 0.9375845 |
| text343 | 0.4111251 | 0.8687484 |
| text344 | 0.4111028 | 0.8154774 |
| text345 | 0.2475118 | 1.1537942 |
| text346 | 0.2920594 | 0.6755002 |
| text347 | 0.2145114 | 0.9147073 |
| text348 | 0.2779102 | 1.0582371 |
| text349 | 0.2317715 | 1.1851195 |
| text350 | 0.2645707 | 1.1364828 |
| text351 | 0.5605146 | 0.6170717 |
| text352 | 0.3368638 | 0.5514239 |
| text353 | 0.6904633 | 1.0950106 |
| text354 | 0.5762414 | 0.8963380 |
| text355 | 0.6778888 | 1.5753139 |
| text356 | 0.8274416 | 1.4061343 |
| text357 | 0.7007740 | 1.1849960 |
| text358 | 0.8912644 | 1.5248712 |
| text359 | 0.5486147 | 1.0843863 |
| text360 | 0.6323086 | 1.0960259 |
| text361 | 0.6978919 | 0.8985863 |
| text362 | 0.7039666 | 1.0516614 |
| text363 | 0.6960935 | 1.1973428 |
| text364 | 0.6836207 | 1.1934184 |
| text365 | 0.6747005 | 0.8483037 |
| text366 | 0.7416662 | 1.0659735 |
| text367 | 0.6739945 | 1.3904267 |
| text368 | 0.0181661 | 0.9297735 |
| text369 | 0.0611222 | 0.8931742 |
| text370 | 0.1092474 | 0.8113496 |
| text371 | 0.2913354 | 0.8863505 |
| text372 | 0.6510522 | 1.3471065 |
| text373 | 0.8627525 | 0.9797336 |
| text374 | 0.5573225 | 0.8980832 |
| text375 | 0.6591690 | 1.1420674 |
| text376 | 0.5375531 | 1.4174926 |
| text377 | 0.5501759 | 1.2557452 |
| text378 | 0.4111817 | 0.9915591 |
| text379 | 0.4465952 | 0.8356191 |
| text380 | 0.4717061 | 0.8727453 |
| text381 | 0.3318749 | 0.8538775 |
| text382 | 0.3586489 | 0.6933831 |
| text383 | 0.4545471 | 1.2606523 |
| text384 | 0.2067255 | 0.7371987 |
| text385 | 0.2653941 | 0.8916448 |
| text386 | 0.3158068 | 0.7768861 |
| text387 | 0.4476452 | 0.9919602 |
| text388 | 0.4603245 | 1.1308930 |
| text389 | 0.4028625 | 0.9506024 |
| text390 | 0.3648852 | 1.2660066 |
| text391 | 0.3336279 | 0.7947007 |
| text392 | 0.4240173 | 0.8614743 |
| text393 | 0.5399458 | 0.8674275 |
| text394 | 0.4606426 | 0.8902993 |
| text395 | -0.1490854 | 0.4710238 |
| text396 | 0.1232572 | 0.7244119 |
| text397 | 0.3336279 | 0.7947007 |
| text398 | 0.4240173 | 0.8614743 |
| text399 | 0.0324851 | 1.6214489 |
| text400 | 0.4847310 | 1.2295748 |
| text401 | 0.2460864 | 1.4184868 |
| text402 | -0.0812260 | 1.3114370 |
| text403 | 0.3962331 | 1.8826016 |
| text404 | 0.0324851 | 1.6214489 |
| text405 | 0.4847310 | 1.2295748 |
| text406 | 0.2460864 | 1.4184868 |
| text407 | -0.0812260 | 1.3114370 |
| text408 | 0.3962331 | 1.8826016 |
| text409 | 0.7636168 | 1.4618025 |
| text410 | 0.5416107 | 1.1363545 |
| text411 | 0.4415958 | 1.3942051 |
| text412 | 0.2536960 | 0.9159228 |
| text413 | 0.4224041 | 1.0697696 |
| text414 | 0.5269848 | 1.1114522 |
| text415 | 0.5449737 | 1.1077517 |
| text416 | 0.5623477 | 1.1003592 |
| text417 | 0.6066045 | 1.1497322 |
| text418 | 0.7364349 | 1.3319911 |
| text419 | 0.6547252 | 1.0819061 |
| text420 | 0.6650692 | 1.2006068 |
| text421 | 0.3049214 | 1.1779172 |
| text422 | 0.2788703 | 0.6016715 |
| text423 | 0.2457381 | 0.5783201 |
| text424 | 0.4903861 | 0.8902466 |
| text425 | 0.3063141 | 0.8518383 |
| text426 | 0.5282662 | 0.8221826 |
| text427 | 0.7067025 | 1.4050319 |
| text428 | 0.3109759 | 0.8314517 |
| text429 | 0.2383530 | 0.6825512 |
| text430 | 0.3627732 | 0.9652688 |
| text431 | 0.2821810 | 0.6604843 |
| text432 | 0.1583457 | 0.3659142 |
| text433 | 0.3722378 | 0.9183564 |
| text434 | 0.0943876 | 0.2513951 |
| text435 | 0.2570220 | 0.4158056 |
| text436 | 0.2492637 | 1.2207262 |
| text437 | 0.0973948 | 0.6703182 |
| text438 | 0.2435318 | 0.9982332 |
| text439 | 0.5264129 | 0.9905574 |
| text440 | 0.5213192 | 0.9295275 |
| text441 | 0.6005908 | 1.1469997 |
| text442 | 0.3617289 | 0.9668233 |
| text443 | 0.3561566 | 0.8134634 |
| text444 | 0.5458376 | 0.9733046 |
| text445 | 0.4000388 | 1.0210860 |
| text446 | 0.2724199 | 0.6842588 |
| text447 | 0.2140167 | 0.8177863 |
| text448 | 0.3405735 | 0.7380054 |
| text449 | 0.4151916 | 0.6122534 |
| text450 | 0.4282795 | 0.9446069 |
| text451 | 0.3767534 | 0.9139113 |
| text452 | 0.3923205 | 0.7035225 |
| text453 | 0.3747714 | 0.9488647 |
| text454 | 0.4279709 | 1.1021225 |
| text455 | 0.3794760 | 0.7522963 |
| text456 | 0.3901892 | 0.6645998 |
| text457 | 0.2936933 | 0.5994919 |
| text458 | 0.3456399 | 0.7206894 |
| text459 | 0.2135812 | 0.6959457 |
| text460 | 0.4725863 | 0.6310257 |
| text461 | 0.2805683 | 0.6692853 |
| text462 | 0.3059800 | 0.8939639 |
| text463 | 0.5622762 | 1.0294075 |
| text464 | 0.3417995 | 0.8653081 |
| text465 | 0.4388044 | 0.7562184 |
| text466 | 0.3814466 | 0.9796112 |
| text467 | 0.5461464 | 1.2855264 |
| text468 | 0.7086750 | 1.3706196 |
| text469 | 0.5792381 | 0.7884153 |
| text470 | 0.6611932 | 1.0877380 |
| text471 | 0.5946199 | 0.9807742 |
| text472 | 0.6346866 | 1.4206694 |
| text473 | 0.4936046 | 1.2635794 |
| text474 | 0.6275573 | 1.1455610 |
| text475 | 0.7827970 | 0.8744029 |
| text476 | 0.6570804 | 1.1596188 |
| text477 | 0.5788993 | 1.1016424 |
| text478 | 0.5890231 | 1.1909359 |
| text479 | 0.5176312 | 1.1492925 |
| text480 | 0.5202415 | 1.0992237 |
| text481 | 0.6261005 | 1.1914991 |
| text482 | 0.4535387 | 1.0582841 |
| text483 | 0.4988018 | 1.1873816 |
| text484 | 0.3865327 | 1.1916586 |
| text485 | 0.4496100 | 1.0968994 |
| text486 | 0.3053093 | 1.0117924 |
| text487 | 0.5296345 | 1.2555957 |
| text488 | 0.2979634 | 1.0819131 |
| text489 | 0.3758559 | 1.3678849 |
| text490 | 0.4994139 | 1.0203396 |
| text491 | 0.5303381 | 1.2575678 |
| text492 | 0.3891828 | 1.2662312 |
| text493 | 0.6699178 | 1.6204592 |
| text494 | 0.6326254 | 1.2361086 |
| text495 | 0.4835608 | 1.2946293 |
| text496 | 0.4481256 | 0.6632901 |
| text497 | 0.4683497 | 0.7059215 |
| text498 | 0.2321290 | 0.5334415 |
| text499 | 0.1499390 | 0.3692649 |
| text500 | 0.1221737 | 0.3158575 |
| text501 | 0.2877763 | 0.7844088 |
| text502 | 0.3533936 | 0.5943971 |
| text503 | 0.5061378 | 1.0132473 |
| text504 | 0.3529759 | 0.7829249 |
| text505 | 0.5342618 | 0.7955358 |
| text506 | 0.3767789 | 0.9282242 |
| text507 | 0.4380104 | 0.7869228 |
| text508 | 0.4831199 | 1.1945884 |
| text509 | 0.3686871 | 1.1858450 |
| text510 | 0.3892565 | 0.9771548 |
| text511 | 0.4760037 | 0.8467505 |
| text512 | 0.4519858 | 0.8330950 |
| text513 | 0.3451816 | 0.9556122 |
| text514 | 0.4918202 | 1.1857527 |
| text515 | 0.4148545 | 1.1555147 |
| text516 | 0.3883027 | 0.9359205 |
| text517 | 0.4459684 | 1.0254828 |
| text518 | 0.4450975 | 1.1471484 |
| text519 | 0.4661425 | 0.8393674 |
| text520 | 0.4362739 | 0.6764246 |
| text521 | 0.4170402 | 0.7367848 |
| text522 | 0.3334748 | 0.6890834 |
| text523 | 0.4583713 | 0.7996144 |
| text524 | 0.5054252 | 0.7590264 |
| text525 | 0.5593312 | 0.9175907 |
| text526 | 0.3013381 | 0.5809016 |
| text527 | 0.4006458 | 1.0382182 |
| text528 | 0.4969629 | 0.8491344 |
| text529 | 0.4680774 | 0.8960993 |
| text530 | 0.2456444 | 0.7054154 |
| text531 | 0.3947497 | 1.0879829 |
| text532 | 0.5053478 | 0.7526021 |
| text533 | 0.2941464 | 0.9403439 |
| text534 | 0.4917231 | 1.0150058 |
| text535 | 0.5793487 | 1.3905700 |
| text536 | 0.2574683 | 0.6754983 |
| text537 | 0.2421586 | 0.7235267 |
| text538 | 0.1930450 | 0.7316296 |
| text539 | 0.2423321 | 0.7133098 |
| text540 | 0.3654987 | 0.9138540 |
| text541 | 0.2691552 | 0.8593345 |
| text542 | 0.3271189 | 0.9115120 |
| text543 | 0.6669953 | 1.4893106 |
| text544 | 0.2340479 | 0.7967422 |
| text545 | 0.4430416 | 0.9223181 |
| text546 | 0.3235181 | 0.6437722 |
| text547 | 0.3938031 | 0.9223834 |
| text548 | 0.5760293 | 0.8124243 |
| text549 | 0.4280844 | 0.6652461 |
| text550 | 0.4599161 | 1.0829765 |
| text551 | 0.2605843 | 0.4471796 |
| text552 | 0.2137757 | 0.3585518 |
| text553 | 0.1549269 | 0.9094906 |
| text554 | 0.3868859 | 0.7741076 |
| text555 | 0.4705041 | 0.8444059 |
| text556 | 0.4834960 | 0.9054558 |
| text557 | 0.4524661 | 0.9645276 |
| text558 | 0.3456580 | 0.9414422 |
| text559 | 0.3169685 | 0.7528349 |
| text560 | 0.2898704 | 0.5846448 |
| text561 | 0.3237804 | 0.7341319 |
| text562 | 0.3535580 | 0.9094558 |
| text563 | 0.3815686 | 0.6539516 |
| text564 | 0.4022721 | 0.8578486 |
| text565 | 0.4378209 | 0.8874040 |
| text566 | 0.2764456 | 0.9966726 |
| text567 | 0.2965436 | 0.6828687 |
| text568 | 0.2315698 | 0.7766421 |
| text569 | 0.2986824 | 1.0962880 |
| text570 | -0.0334517 | 0.1780741 |
| text571 | 0.4295894 | 0.8880538 |
| text572 | 0.0695282 | 0.6502792 |
| text573 | 0.0662805 | 0.6191795 |
| text574 | 0.1133085 | 0.6634463 |
| text575 | 0.0980991 | 0.6686551 |
| text576 | 0.2185575 | 0.7235560 |
| text577 | 0.1411426 | 0.6582242 |
| text578 | 0.2630089 | 0.6260372 |
| text579 | 0.0274965 | 0.2992423 |
| text580 | 0.0871211 | 0.4397671 |
| text581 | 0.1180869 | 0.4041820 |
| text582 | 0.0729612 | 0.4817363 |
| text583 | 0.0609134 | 0.5763914 |
| text584 | 0.1833580 | 0.5633736 |
| text585 | 0.2393619 | 0.6968864 |
| text586 | 0.3589842 | 0.7810814 |
| text587 | 0.2496806 | 0.6847276 |
| text588 | 0.3111860 | 0.7165724 |
| text589 | 0.1105253 | 0.3943778 |
| text590 | 0.5228282 | 0.9668035 |
| text591 | 0.4224802 | 0.8805288 |
| text592 | 0.2766424 | 0.5591195 |
| text593 | 0.3410504 | 1.1107866 |
| text594 | 0.3329860 | 0.8980952 |
| text595 | 0.2939312 | 0.7837986 |
| text596 | 0.2929778 | 1.0651679 |
| text597 | 0.2325117 | 1.1186062 |
| text598 | 0.4223215 | 1.1188737 |
| text599 | -0.1039860 | 0.4935199 |
| text600 | -0.0299532 | 0.5482112 |
| text601 | -0.0863479 | 0.5107470 |
| text602 | 0.0993963 | 0.6905252 |
| text603 | 0.1447485 | 0.8973417 |
| text604 | 0.3554696 | 1.0601039 |
| text605 | 0.3142802 | 1.1531081 |
| text606 | 0.3379887 | 1.0452997 |
| text607 | 0.2251284 | 0.9454692 |
| text608 | 0.2087186 | 0.8256978 |
| text609 | 0.0584733 | 0.7901222 |
| text610 | 0.0945842 | 0.7808835 |
| text611 | 0.0279997 | 0.7727654 |
| text612 | 0.1809605 | 0.6747807 |
| text613 | 0.2013423 | 0.6145899 |
| text614 | 0.3374288 | 0.9029154 |
| text615 | 0.2211287 | 0.6266821 |
| text616 | 0.3464172 | 0.8963044 |
| text617 | 0.3313698 | 0.4550623 |
| text618 | 0.4041565 | 0.4263668 |
| text619 | 0.3030636 | 0.8073214 |
| text620 | 0.4515504 | 1.3703203 |
| text621 | 0.3444254 | 0.7617646 |
| text622 | 0.2642440 | 0.4573061 |
| text623 | 0.3200919 | 0.6580008 |
| text624 | 0.3265691 | 0.4664276 |
| text625 | 0.2618299 | 0.7795604 |
| text626 | 0.2227268 | 0.9764481 |
| text627 | 0.1958802 | 1.0559906 |
| text628 | 0.6630988 | 1.1730647 |
| text629 | 0.4633545 | 1.2168064 |
| text630 | 0.0622683 | 1.0132832 |
| text631 | 0.2234709 | 0.7374613 |
| text632 | 0.7298667 | 0.8157608 |
| text633 | 0.1596874 | 1.0235660 |
| text634 | 0.2130523 | 0.9276377 |
| text635 | 0.1930663 | 0.9777619 |
| text636 | 0.1611777 | 1.0644796 |
| text637 | 0.2764888 | 0.9615222 |
| text638 | 0.2309535 | 1.0491846 |
| text639 | 0.2975397 | 1.1055129 |
| text640 | 0.2833114 | 0.9315524 |
| text641 | 0.2260793 | 0.9262152 |
| text642 | 0.2304927 | 0.7819986 |
| text643 | 0.1580500 | 0.9886582 |
| text644 | 0.4167398 | 0.9266948 |
| text645 | 0.1371837 | 0.9772813 |
| text646 | 0.0653900 | 0.9244341 |
| text647 | -0.0137722 | 1.0288258 |
| text648 | 0.2881240 | 1.2314367 |
| text649 | 0.2927352 | 0.7830019 |
| text650 | 0.1653117 | 0.7805981 |
| text651 | 0.0765184 | 0.7126130 |
| text652 | 0.2272501 | 0.5691103 |
| text653 | 0.2668969 | 0.9708505 |
| text654 | 0.2429952 | 0.8860592 |
| text655 | 0.1828274 | 0.8452716 |
| text656 | 0.1535291 | 0.8170360 |
| text657 | 0.2327280 | 0.6509025 |
| text658 | 0.2488900 | 0.8284006 |
| text659 | 0.3024061 | 0.5039223 |
| text660 | 0.2097016 | 0.5127723 |
| text661 | 0.0855189 | 0.7565196 |
| text662 | 0.2191628 | 0.9552539 |
| text663 | 0.1711957 | 0.6702492 |
| text664 | 0.3091114 | 0.7944545 |
| text665 | 0.2175615 | 0.6149553 |
| text666 | 0.3769742 | 0.9692661 |
| text667 | 0.4013058 | 0.8608016 |
| text668 | 0.3890355 | 0.7573442 |
| text669 | 0.3174048 | 1.4518669 |
| text670 | 0.2673952 | 1.1063469 |
| text671 | 0.3683995 | 1.1291365 |
| text672 | 0.3299014 | 1.2349571 |
| text673 | 0.2715428 | 1.0148777 |
| text674 | 0.4742844 | 1.4230372 |
| text675 | 0.0152740 | 0.1358917 |
| text676 | -0.0228369 | 0.3003024 |
| text677 | 0.3361106 | 1.5012567 |
| text678 | 0.4710060 | 1.2302274 |
| text679 | 0.1129797 | 1.0719682 |
| text680 | 0.3445823 | 1.1436390 |
| … | … | … |
| text1470 | 0.3982378 | 0.9074335 |
| text1471 | 0.4415073 | 1.1602723 |

(rows text681 through text1469 omitted for brevity)
Now, we plot the document embeddings, using a different color for each category.
TED.de <- as.data.frame(TED.de)
TED.de.source <- TED_full %>%
  select(2, 3, 4) %>%
  cbind(TED.de)
ggplot(data = TED.de.source, mapping = aes(x = V1, y = V2, color = cate)) +
  geom_point() +
  labs(x = "dimension1", y = "dimension2") +
  scale_colour_discrete(
    name = "Category",
    breaks = c("1", "2", "3"),
    labels = c("AI", "Climate change", "Relationships")
  )
According to this plot, the documents in the “AI” and “Relationships” categories overlap the most. This suggests that documents in these two categories are more similar to each other than either is to the documents in the “Climate change” category.
For supervised learning, we use the results from the LSA analysis in the previous section and build a random forest model to predict the category. Reducing the dimensionality of the data before running a supervised learning algorithm can be beneficial: it lowers the complexity of the data, which can improve the algorithm's performance and make the results easier to interpret.
In this section, we apply the features from the LSA on TF in supervised learning. We use a data frame consisting of the category and the “doc” matrix from the LSA, then build the training-set index based on an 80/20 split.
a <- c(1:nrow(TED.lsa.source))
row.names(TED.lsa.source) <- a
TED.lsa.source$cate <- as.factor(TED.lsa.source$cate)
set.seed(123)
index.tr <- createDataPartition(y = TED.lsa.source$cate, p= 0.8, list = FALSE)
TED.tr <- TED.lsa.source[index.tr,]
TED.te <- TED.lsa.source[-index.tr,]
The data is imbalanced, so we use a sub-sampling method to balance it. The table below shows that there are 450, 327, and 401 observations for Topic 1 (AI), Topic 2 (Climate change), and Topic 3 (Relationships), respectively.
kable(as.data.frame(table(TED.tr$cate)) %>% rename(Topic = Var1, Count = Freq) , caption = "The number of observations per Topic") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| Topic | Count |
|---|---|
| 1 | 450 |
| 2 | 327 |
| 3 | 401 |
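This imbalance matters because it sets the baseline a classifier must beat: always predicting the majority class already achieves that class's share of the data. A quick arithmetic check (shown in Python for illustration, using the training counts above):

```python
# Majority-class baseline for the imbalanced training set:
# the accuracy of always predicting the most frequent topic.
counts = {"AI": 450, "Climate change": 327, "Relationships": 401}

total = sum(counts.values())             # 1178 training documents
baseline = max(counts.values()) / total  # always predict "AI"

print(round(baseline, 3))  # → 0.382
```

Any model that does not clearly beat this baseline has learned little beyond the class distribution, which is why we balance the training set before fitting.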
After sub-sampling, the number of observations for each category is 327.
set.seed(123)
n2 <- min(table(TED.tr$cate)) ## 327
TED.tr.1 <- filter(TED.tr, cate=="1") ## the category 1
TED.tr.2 <- filter(TED.tr, cate=="2") ## the category 2
TED.tr.3 <- filter(TED.tr, cate=="3") ## the category 3
index.1 <- sample(size=n2, x=1:nrow(TED.tr.1), replace=FALSE)
index.3 <- sample(size=n2, x=1:nrow(TED.tr.3), replace=FALSE)
TED.tr.subs <- data.frame(rbind(TED.tr.1[index.1,],
TED.tr.2,
TED.tr.3[index.3,]))
kable(as.data.frame(table(TED.tr.subs$cate)) %>% rename(Topic = Var1, Count = Freq) , caption = "The number of observations per Topic, after sub-sampling") %>%
kable_styling(bootstrap_options = "bordered") %>%
kableExtra::scroll_box(width = "100%", height = "250px")
| Topic | Count |
|---|---|
| 1 | 327 |
| 2 | 327 |
| 3 | 327 |
We now use a random forest to predict the category from the LSA on TF. Then, the model accuracy is inspected on the test set. The results below show a confusion matrix and the associated statistics.
TED.fit <- ranger(TED.tr.subs$cate ~ .,
data = TED.tr.subs[2:6])
pred.te <- predict(TED.fit, TED.te)
confusionMatrix(data=pred.te$predictions, reference = TED.te$cate)
## Confusion Matrix and Statistics
##
## Reference
## Prediction 1 2 3
## 1 81 4 6
## 2 11 71 9
## 3 20 6 85
##
## Overall Statistics
##
## Accuracy : 0.8089
## 95% CI : (0.7591, 0.8523)
## No Information Rate : 0.3823
## P-Value [Acc > NIR] : < 2.2e-16
##
## Kappa : 0.7131
##
## Mcnemar's Test P-Value : 0.009725
##
## Statistics by Class:
##
## Class: 1 Class: 2 Class: 3
## Sensitivity 0.7232 0.8765 0.8500
## Specificity 0.9448 0.9057 0.8653
## Pos Pred Value 0.8901 0.7802 0.7658
## Neg Pred Value 0.8465 0.9505 0.9176
## Prevalence 0.3823 0.2765 0.3413
## Detection Rate 0.2765 0.2423 0.2901
## Detection Prevalence 0.3106 0.3106 0.3788
## Balanced Accuracy 0.8340 0.8911 0.8576
According to the confusion matrix, the overall accuracy is 0.8089, and the balanced accuracy is 0.8340, 0.8911, and 0.8576 for classes 1, 2, and 3, respectively.
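The per-class statistics follow directly from the confusion-matrix counts. As a sketch (shown in Python; the counts are copied from the output above), sensitivity, specificity, and balanced accuracy for class 1 are:

```python
# Confusion matrix from the output above: rows = prediction, cols = reference.
cm = [
    [81,  4,  6],   # predicted class 1
    [11, 71,  9],   # predicted class 2
    [20,  6, 85],   # predicted class 3
]

total = sum(sum(row) for row in cm)   # 293 test documents
ref1 = sum(row[0] for row in cm)      # 112 documents truly in class 1

sensitivity = cm[0][0] / ref1                      # TP / (TP + FN)
fp = cm[0][1] + cm[0][2]                           # class-1 false positives
tn = total - ref1 - fp                             # true negatives
specificity = tn / (total - ref1)                  # TN / (TN + FP)
balanced_accuracy = (sensitivity + specificity) / 2

print(round(sensitivity, 4))        # → 0.7232
print(round(specificity, 4))        # → 0.9448
print(round(balanced_accuracy, 4))  # → 0.834
```

These reproduce the values in the `confusionMatrix` output (0.7232, 0.9448, 0.8340 for class 1).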
Now we use LSA on TF-IDF instead and repeat the same steps with the sub-sampling technique and the random forest model. The results below show a confusion matrix and the associated statistics.
## Attach the document ids and recode the category as a factor
TED.lsa2.source <- cbind(document, TED.lsa2.source)
row.names(TED.lsa2.source) <- a
TED.lsa2.source$cate <- as.factor(TED.lsa2.source$cate)

## 80/20 train/test split, stratified on the category
set.seed(123)
index.tr <- createDataPartition(y = TED.lsa2.source$cate, p = 0.8, list = FALSE)
TED.tr <- TED.lsa2.source[index.tr, ]
TED.te <- TED.lsa2.source[-index.tr, ]

## Down-sample categories 1 and 3 to the size of the smallest category
n2 <- min(table(TED.tr$cate)) ## 327
TED.tr.1 <- filter(TED.tr, cate == "1") ## category 1
TED.tr.2 <- filter(TED.tr, cate == "2") ## category 2
TED.tr.3 <- filter(TED.tr, cate == "3") ## category 3
index.1 <- sample(size = n2, x = 1:nrow(TED.tr.1), replace = FALSE)
index.3 <- sample(size = n2, x = 1:nrow(TED.tr.3), replace = FALSE)
TED.tr.subs <- data.frame(rbind(TED.tr.1[index.1, ],
                                TED.tr.2,
                                TED.tr.3[index.3, ]))
## Random forest on the LSA (TF-IDF) features; column 1 (the document id) is dropped
TED.fit2 <- ranger(cate ~ ., data = TED.tr.subs[2:6])
pred.te <- predict(TED.fit2, TED.te)
confusionMatrix(data=pred.te$predictions, reference = TED.te$cate)
## Confusion Matrix and Statistics
##
## Reference
## Prediction 1 2 3
## 1 91 3 3
## 2 5 73 3
## 3 16 5 94
##
## Overall Statistics
##
## Accuracy : 0.8805
## 95% CI : (0.8378, 0.9154)
## No Information Rate : 0.3823
## P-Value [Acc > NIR] : < 2e-16
##
## Kappa : 0.8198
##
## Mcnemar's Test P-Value : 0.01948
##
## Statistics by Class:
##
## Class: 1 Class: 2 Class: 3
## Sensitivity 0.8125 0.9012 0.9400
## Specificity 0.9669 0.9623 0.8912
## Pos Pred Value 0.9381 0.9012 0.8174
## Neg Pred Value 0.8929 0.9623 0.9663
## Prevalence 0.3823 0.2765 0.3413
## Detection Rate 0.3106 0.2491 0.3208
## Detection Prevalence 0.3311 0.2765 0.3925
## Balanced Accuracy 0.8897 0.9317 0.9156
According to the confusion matrix, the accuracy is 0.8805 and the balanced accuracy for class 1 is 0.8897, for class 2 is 0.9317, and for class 3 is 0.9156. Thus, we can conclude that the model built on LSA (TF-IDF) features has a higher accuracy than the model built on LSA (TF) features.
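As a side note, the 95% confidence interval that caret prints is an exact binomial interval on the number of correct test-set predictions, here 91 + 73 + 94 = 258 correct out of 293 talks. A minimal sketch of the same computation:

```r
## Exact binomial confidence interval on 258 correct predictions out of 293
binom.test(x = 258, n = 293)$conf.int
## caret reports (0.8378, 0.9154) for this model
```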
For the next step, we use the document embeddings from the Embedding section as features in the random forest model. Before fitting the model, we again use the sub-sampling technique to balance the data. The results below show the confusion matrix and the associated statistics.
## Recode the category and reuse the same train/test split as before
row.names(TED.de.source) <- a
TED.de.source$cate <- as.factor(TED.de.source$cate)
set.seed(123)
TED.Emb.tr <- TED.de.source[index.tr, c('cate', 'V1', 'V2')]
TED.Emb.te <- TED.de.source[-index.tr, c('cate', 'V1', 'V2')]

## Down-sample categories 1 and 3 to the size of the smallest category
set.seed(123)
n2 <- min(table(TED.tr$cate)) ## 327
TED.Emb.tr.1 <- filter(TED.Emb.tr, cate == "1") ## category 1
TED.Emb.tr.2 <- filter(TED.Emb.tr, cate == "2") ## category 2
TED.Emb.tr.3 <- filter(TED.Emb.tr, cate == "3") ## category 3
index.1 <- sample(size = n2, x = 1:nrow(TED.Emb.tr.1), replace = FALSE)
index.3 <- sample(size = n2, x = 1:nrow(TED.Emb.tr.3), replace = FALSE)
TED.Emb.tr.subs <- data.frame(rbind(TED.Emb.tr.1[index.1, ],
                                    TED.Emb.tr.2,
                                    TED.Emb.tr.3[index.3, ]))

## Random forest on the two embedding dimensions
set.seed(123)
TED.Emb.fit <- ranger(cate ~ ., data = TED.Emb.tr.subs)
pred.Emb.te <- predict(TED.Emb.fit, TED.Emb.te)
confusionMatrix(data=pred.Emb.te$predictions, reference = TED.Emb.te$cate)
## Confusion Matrix and Statistics
##
## Reference
## Prediction 1 2 3
## 1 43 9 20
## 2 33 59 17
## 3 36 13 63
##
## Overall Statistics
##
## Accuracy : 0.5631
## 95% CI : (0.5042, 0.6207)
## No Information Rate : 0.3823
## P-Value [Acc > NIR] : 2.784e-10
##
## Kappa : 0.3506
##
## Mcnemar's Test P-Value : 0.000298
##
## Statistics by Class:
##
## Class: 1 Class: 2 Class: 3
## Sensitivity 0.3839 0.7284 0.6300
## Specificity 0.8398 0.7642 0.7461
## Pos Pred Value 0.5972 0.5413 0.5625
## Neg Pred Value 0.6878 0.8804 0.7956
## Prevalence 0.3823 0.2765 0.3413
## Detection Rate 0.1468 0.2014 0.2150
## Detection Prevalence 0.2457 0.3720 0.3823
## Balanced Accuracy 0.6119 0.7463 0.6881
We find that the accuracy of this model is only 0.5631 and the balanced accuracy for class 1 is 0.6119, for class 2 is 0.7463, and for class 3 is 0.6881, all of which are much lower than the accuracies of the previous models.
From the first three models, we find that the model with LSA (TF-IDF) features has the highest accuracy. In this section, we add additional information, including the number of likes and the number of views, as extra features in the supervised learning model. After balancing the data, we train a random forest and obtain the results shown below.
## Rename the extra columns so they do not clash with the LSA feature names
TED.de.source <- TED.de.source %>%
  rename(V5 = V1, V6 = V2)

## Combine the LSA (TF-IDF) features with the additional columns
TED.LSA.Emb <- TED.lsa2.source %>%
  cbind(TED.de.source) %>%
  select(-7) ## drop the duplicated category column
TED.Com.tr <- TED.LSA.Emb[index.tr, ]
TED.Com.te <- TED.LSA.Emb[-index.tr, ]

## Down-sample categories 1 and 3 to the size of the smallest category
set.seed(123)
TED.Com.tr.1 <- filter(TED.Com.tr, cate == "1") ## category 1
TED.Com.tr.2 <- filter(TED.Com.tr, cate == "2") ## category 2
TED.Com.tr.3 <- filter(TED.Com.tr, cate == "3") ## category 3
index.1 <- sample(size = n2, x = 1:nrow(TED.Com.tr.1), replace = FALSE)
index.3 <- sample(size = n2, x = 1:nrow(TED.Com.tr.3), replace = FALSE)
TED.Com.tr.subs <- data.frame(rbind(TED.Com.tr.1[index.1, ],
                                    TED.Com.tr.2,
                                    TED.Com.tr.3[index.3, ]))

## Random forest on the combined feature set; column 1 (the document id) is dropped
set.seed(123)
TED.Com.fit <- ranger(cate ~ ., data = TED.Com.tr.subs[2:10])
pred.Com.te <- predict(TED.Com.fit, TED.Com.te)
confusionMatrix(data=pred.Com.te$predictions, reference = TED.Com.te$cate)
## Confusion Matrix and Statistics
##
## Reference
## Prediction 1 2 3
## 1 95 1 3
## 2 5 75 2
## 3 12 5 95
##
## Overall Statistics
##
## Accuracy : 0.9044
## 95% CI : (0.8648, 0.9356)
## No Information Rate : 0.3823
## P-Value [Acc > NIR] : < 2e-16
##
## Kappa : 0.8559
##
## Mcnemar's Test P-Value : 0.02495
##
## Statistics by Class:
##
## Class: 1 Class: 2 Class: 3
## Sensitivity 0.8482 0.9259 0.9500
## Specificity 0.9779 0.9670 0.9119
## Pos Pred Value 0.9596 0.9146 0.8482
## Neg Pred Value 0.9124 0.9716 0.9724
## Prevalence 0.3823 0.2765 0.3413
## Detection Rate 0.3242 0.2560 0.3242
## Detection Prevalence 0.3379 0.2799 0.3823
## Balanced Accuracy 0.9131 0.9465 0.9310
We find that the performance of the model slightly improves. The accuracy increases from 0.8805 to 0.9044. The balanced accuracy for class 1 increases from 0.8897 to 0.9131, for class 2 from 0.9317 to 0.9465, and for class 3 from 0.9156 to 0.9310. Thus, we conclude that this is the best-performing model.
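As a sanity check on the reported statistics, the Cohen's kappa printed for this final model can be recomputed directly from its confusion matrix:

```r
## Confusion matrix of the final model: rows = prediction, columns = reference
cm <- matrix(c(95, 5, 12,   # reference class 1
               1, 75, 5,    # reference class 2
               3, 2, 95),   # reference class 3
             nrow = 3)
n  <- sum(cm)                               # 293 test talks
po <- sum(diag(cm)) / n                     # observed agreement (accuracy)
pe <- sum(rowSums(cm) * colSums(cm)) / n^2  # agreement expected by chance
(po - pe) / (1 - pe)                        ## 0.8559, as reported
```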
The results of the sentiment analysis suggest that TED talks tend to present a positive sentiment, which may be intended to inspire and motivate audiences. On the other hand, the sentiment scores do not clearly differentiate between the categories of talks, so to some extent there is a lack of diversity in terms of the videos' sentiment.
Topic analysis was then used to cluster the TED videos and compare the results to the known categories provided by the TED website. The data visualization from the topic modeling study showed that the clusters generated by both Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were closely aligned with the categories labeled by the TED website. Additionally, the performance of a supervised learning model that used the results of LSA as features was assessed, providing a more quantitative measure of the effectiveness of LSA in categorizing the texts.
Supervised learning techniques were applied to predict the category of TED videos using a random forest model. Four different sets of features were used for the prediction model: Latent Semantic Analysis (LSA) on Term Frequency (TF), LSA on Term Frequency-Inverse Document Frequency (TF-IDF), document embeddings, and a combination of LSA on TF-IDF and additional information such as the number of likes and views. The results showed that the last model performed best, with an accuracy of 0.9044 and balanced accuracies above 0.91 for all categories.
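The comparison across the four feature sets can be collected in one small table, with the accuracies taken from the test-set outputs above:

```r
## Test-set accuracy of the four random forest models
results <- data.frame(
  features = c("LSA (TF)", "LSA (TF-IDF)", "Document embeddings",
               "LSA (TF-IDF) + additional information"),
  accuracy = c(0.8089, 0.8805, 0.5631, 0.9044)
)
results[order(-results$accuracy), ]  ## best model first
```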
One limitation of the data used in our analysis is the difficulty of obtaining a large amount of transcript data, given constraints on time and computing capacity. Additionally, the structure of the TED website is not conducive to scraping text, which limits the richness and diversity of the data. Furthermore, additional features on the TED website, such as information about the speakers and viewer comments, are scattered across pages, limiting their value in our analysis.
Future research could expand on the available information. For example, one could analyze trends in TED talk releases and predict their release schedule, which would help investors understand TED's plans and the messages it intends to deliver to its audience. One could also analyze variations in the wording of speeches given by the same speakers or on the same topics. Finally, TED currently offers nearly 400 options for categorizing videos, which may not be ideal for users trying to find a specific topic; TED could improve its video classification by analyzing which categories viewers are more inclined to comment on in their feedback.